Spark: com.mysql.jdbc.Driver does not allow create table as select

Date: 2015-10-07 15:58:18

Tags: mysql jdbc apache-spark apache-spark-sql pyspark

I get the following error when trying to save to a MySQL database via Spark:

Py4JJavaError: An error occurred while calling o41.saveAsTable.
: java.lang.RuntimeException: com.mysql.jdbc.Driver does not allow create table as select.
    at scala.sys.package$.error(package.scala:27)
    at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:242)
    at org.apache.spark.sql.hive.execution.CreateMetastoreDataSourceAsSelect.run(commands.scala:218)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:54)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:54)
    at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:64)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
    at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1121)
    at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1091)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:259)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Thread.java:745)

My Python code:

    res.saveAsTable(tableName='test.provider_phones',
                    source='com.mysql.jdbc.Driver',
                    driver='com.mysql.jdbc.Driver',
                    mode='append',
                    url='jdbc:mysql://host.amazonaws.com:port/test?user=user&password=pass')

This happens whether or not the table already exists.

I am using Spark 1.3.1.

2 Answers:

Answer 0 (score: 1)

You can use the createJDBCTable(url: String, table: String, allowExisting: Boolean) or insertIntoJDBC(url: String, table: String, overwrite: Boolean) functions of DataFrame.

http://www.sparkexpert.com/2015/04/17/save-apache-spark-dataframe-to-database/
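
A minimal Scala sketch of those two DataFrame calls on Spark 1.3.x; the connection string and table name below are placeholders carried over from the question, not verified values:

    import org.apache.spark.sql.DataFrame

    def saveToMySQL(res: DataFrame): Unit = {
      // Placeholder JDBC URL copied from the question; substitute real values.
      val url = "jdbc:mysql://host.amazonaws.com:port/test?user=user&password=pass"

      // Either: create the table from the DataFrame's schema and load the rows.
      // Passing allowExisting = true drops an existing table of the same name first.
      res.createJDBCTable(url, "provider_phones", allowExisting = false)

      // Or: insert into a table that already exists with a compatible schema.
      // overwrite = true would truncate the table before inserting.
      // res.insertIntoJDBC(url, "provider_phones", overwrite = false)
    }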

Answer 1 (score: 1)

Unfortunately, this is not possible in PySpark 1.3.1. My workaround was to switch to Scala and use DataFrame.insertIntoJDBC.
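
For illustration, a sketch of that Scala workaround for the append case in the question; the connection details are the question's placeholders and the connector jar name is only an example:

    // Start the shell with the MySQL driver on the classpath, e.g.:
    //   spark-shell --driver-class-path mysql-connector-java-5.1.34-bin.jar
    import org.apache.spark.sql.DataFrame

    def appendProviderPhones(res: DataFrame): Unit = {
      val url = "jdbc:mysql://host.amazonaws.com:port/test?user=user&password=pass"
      // The database (test) is already part of the URL, so only the bare
      // table name is passed; overwrite = false appends without truncating.
      res.insertIntoJDBC(url, "provider_phones", overwrite = false)
    }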