How do I connect Spark to MySQL from PyCharm?

Asked: 2016-06-24 12:17:58

Tags: python mysql jdbc apache-spark pycharm

I am trying to load the contents of a table from a MySQL database into a DataFrame. I followed these steps to connect Spark to MySQL:

  1. Downloaded mysql-connector-java-5.0.8-bin.jar
  2. Placed mysql-connector-java-5.0.8-bin.jar at $SPARK_HOME/bin/mysql-connector-java-5.0.8-bin.jar, but it still did not work
  3. Code:

    from pyspark import SparkContext
    from pyspark.sql import SQLContext, Row
    
    sc = SparkContext()
    sqlctx = SQLContext(sc)
    
    
    dataframe_mysql = sqlctx.read.format("jdbc").options(
        url="jdbc:mysql://localhost:3306/database",
        driver="com.mysql.jdbc.Driver",
        dbtable="user",
        user="root",
        password="").load()
    

    After that I tried to connect Spark to MySQL from cmd, using the --jars parameter.

    Launching the pyspark shell:
    $SPARK_HOME/bin/pyspark --jars mysql-connector-java-5.0.8-bin.jar
    

    That did not work either. I don't understand why connecting to MySQL with driver="com.mysql.jdbc.Driver" fails. Can anyone help me? Thanks in advance.

    I got this error:

    File "C:/Users/kcs/PycharmProjects/Flunky/SparkMySql.py", line 14, in <module>
      password="").load()
    File "C:\DataScience\python\pyspark\sql\readwriter.py", line 123, in load
      return self._df(self._jreader.load())
    File "C:\DataScience\python\lib\py4j-0.8.2.1-src.zip\py4j\java_gateway.py", line 538, in __call__
    File "C:\DataScience\python\pyspark\sql\utils.py", line 36, in deco
      return f(*a, **kw)
    File "C:\DataScience\python\lib\py4j-0.8.2.1-src.zip\py4j\protocol.py", line 300, in get_return_value
    py4j.protocol.Py4JJavaError: An error occurred while calling o24.load.
    : java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:38)
    at org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:41)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:125)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:259)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Unknown Source)
    

    and the error on cmd is
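[Editor's note] The `ClassNotFoundException` above is raised on the JVM side: Spark cannot find the driver class on its classpath, so the Python-side options themselves are not the problem. Those options can still be sanity-checked without a running Spark installation. A minimal sketch — the helper name `mysql_jdbc_options` and its defaults are my own invention, not part of the question:

```python
# Hypothetical helper: assembles the keyword options that
# DataFrameReader.format("jdbc").options(**opts) expects for a MySQL source.
# Plain Python, so the shape of the options can be checked without Spark.
def mysql_jdbc_options(host, port, database, table, user, password):
    return {
        "url": "jdbc:mysql://{}:{}/{}".format(host, port, database),
        "driver": "com.mysql.jdbc.Driver",  # the class the missing jar must provide
        "dbtable": table,
        "user": user,
        "password": password,
    }

opts = mysql_jdbc_options("localhost", 3306, "database", "user", "root", "")
print(opts["url"])  # jdbc:mysql://localhost:3306/database
```

Even with correct options, the load will fail until the connector jar is actually visible to the driver JVM.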

2 Answers:

Answer 0 (score: 1)

Look closely: you downloaded mysql-connector-java-5.0.8-bin.jar, but in your code you launch the shell with $SPARK_HOME/bin/pyspark --jars mysql-connector-java-5.1.8-bin.jar. Your MySQL driver versions are not the same, so fix that and check again.
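[Editor's note] In other words, the jar named on the command line must be the exact file that was downloaded. A consistent invocation might look like the following (a sketch; the jar path is illustrative, and on some Spark 1.x builds `--driver-class-path` is also needed so the driver JVM sees the class):

```
# Use the exact jar you downloaded; pass it both ways for good measure.
$SPARK_HOME/bin/pyspark \
  --jars mysql-connector-java-5.0.8-bin.jar \
  --driver-class-path mysql-connector-java-5.0.8-bin.jar
```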

Answer 1 (score: 0)

I hit this error at runtime too, when connecting to MySQL from PyCharm with Spark SQL:

from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="mysqltest")
sqlContext = SQLContext(sc)
df = sqlContext.read.format("jdbc")\
    .options(url="jdbc:mysql://localhost:3306/test",
             driver="com.mysql.jdbc.Driver", dbtable="user", user="root", password="root")\
    .load()
df.show()
sc.stop()

Adding mysql-connector-java-5.0.8-bin.jar to %SPARK_HOME%\jars resolved the error.
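[Editor's note] If copying jars into the Spark install is fragile, another option is to register the connector once in Spark's configuration file, so every session picks it up. A sketch, assuming a Windows layout — the `C:\libs\...` path is illustrative and must be adjusted to your machine:

```
# %SPARK_HOME%\conf\spark-defaults.conf
spark.driver.extraClassPath    C:\libs\mysql-connector-java-5.0.8-bin.jar
spark.executor.extraClassPath  C:\libs\mysql-connector-java-5.0.8-bin.jar
```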