Exporting Hive data to Netezza using Spark

Time: 2017-02-09 11:41:40

Tags: apache-spark spark-dataframe netezza apache-spark-dataset

This question concerns a use case my team is working on: exporting metadata and data from a HIVE server to an RDBMS.

Exports to MySQL and Oracle work fine, but the export to Netezza fails with the following error:

17/02/09 16:03:07 INFO DAGScheduler: Job 1 finished: json at RdbmsSandboxExecution.java:80, took 0.433405 s
17/02/09 16:03:07 INFO TaskSetManager: Finished task 0.0 in stage 3.0 (TID 3) in 143 ms on localhost (1/1)
17/02/09 16:03:07 INFO TaskSchedulerImpl: Removed TaskSet 3.0, whose tasks have all completed, from pool
Exception in thread "main" java.sql.SQLException: No suitable driver
        at java.sql.DriverManager.getDriver(DriverManager.java:278)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$2.apply(JdbcUtils.scala:50)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$2.apply(JdbcUtils.scala:50)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.createConnectionFactory(JdbcUtils.scala:49)
        at org.apache.spark.sql.DataFrameWriter.jdbc(DataFrameWriter.scala:278)
        at org.apache.spark.sql.DataFrame.createJDBCTable(DataFrame.scala:1767)
        at com.zaloni.mica.datatype.conversion.RdbmsSandboxExecution.main(RdbmsSandboxExecution.java:81)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/02/09 16:03:07 INFO SparkContext: Invoking stop() from shutdown hook
17/02/09 16:03:07 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static/sql,null}
17/02/09 16:03:07 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL1/execution/json,null}

We are using DataFrame.createJDBCTable.
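The exception in the trace is raised by java.sql.DriverManager.getDriver, which only matches JDBC drivers that are already registered in the current JVM; if no registered driver accepts the jdbc:netezza URL, it fails with exactly this message. A minimal JDK-only sketch that reproduces the message (the URL is a placeholder, and no Netezza jar is on the classpath here):

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class DriverCheck {
    public static void main(String[] args) {
        // Placeholder Netezza-style JDBC URL; host, port, and database are invented.
        String url = "jdbc:netezza://host:5480/mydb";
        try {
            // Scans only the drivers currently registered with DriverManager.
            DriverManager.getDriver(url);
            System.out.println("driver found");
        } catch (SQLException e) {
            // With no driver registered for this URL scheme, the JDK throws
            // SQLException with the message "No suitable driver".
            System.out.println(e.getMessage());
        }
    }
}
```

This is why the failure is specific to Netezza even though the same code path works for MySQL and Oracle: whether it succeeds depends entirely on which drivers have been registered in the driver JVM at the time of the call.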

The spark-submit command we are using is:

spark-submit --class <java_class_with_export_logic> --master local --deploy-mode client --conf spark.driver.extraClassPath=/absolute-path/nzjdbc3.jar --jars /absolute-path/nzjdbc3.jar /absolute-path/<application-jar> <JDBC_URL>

0 answers:

No answers