I created an sbt project in IntelliJ and copied the required JDBC jar, sqljdbc42.jar, into the project's lib folder. sbt package completed successfully. On Windows I launched the Spark shell with:
spark-shell --driver-class-path C:\sqljdbc_6.0\enu\jre8\sqljdbc42.jar
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import java.sql._

object ConnTest extends App {
  val conf = new SparkConf()
  val sc = new SparkContext(conf.setAppName("Test").setMaster("local[*]"))
  // The following four statements work if running interactively in the Spark shell
  val sqlContext = new org.apache.spark.sql.SQLContext(sc)
  val jdbcSqlConn = "jdbc:sqlserver://...;databaseName=...;user=...;password=...;"
  val jdbcDf = sqlContext.read.format("jdbc").options(Map(
    "url" -> jdbcSqlConn,
    "dbtable" -> "testTable"
  )).load()
  jdbcDf.show(10)
  sc.stop()
}
However, the following spark-submit commands give an error.
spark-submit.cmd --class ConnTest --master local[4] .\target\scala-2.11\test_2.11-1.0.jar
spark-submit.cmd --class ConnTest --master local[4] .\target\scala-2.11\test_2.11-1.0.jar --jars \sqljdbc_6.0\enu\jre8\sqljdbc42.jar
Exception in thread "main" java.sql.SQLException: No suitable driver
    at java.sql.DriverManager.getDriver(Unknown Source)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:83)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:34)
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:330)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
    at ConnTest$.delayedEndpoint$ConnTest$1(main.scala:14)
    at ConnTest$delayedInit$body.apply(main.scala:6)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at scala.App$$anonfun$main$1.apply(App.scala:76)
    at scala.App$$anonfun$main$1.apply(App.scala:76)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
    at scala.App$class.main(App.scala:76)
    at ConnTest$.main(main.scala:6)
    at ConnTest.main(main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Update: The Spark code itself works; if I run the statements interactively in the Spark shell I can see the table contents.
Update 2:
Running spark-submit prints:
17/05/15 16:12:30 INFO SparkContext: Added JAR file:/C:/sqljdbc_6.0/enu/jre8/sqljdbc42.jar at spark://10.8.159.130:7587/jars/sqljdbc42.jar with timestamp 1494879150052
Answer 0 (score: 2)
Setting this additional option solved the problem.
"driver" -> "com.microsoft.sqlserver.jdbc.SQLServerDriver",
Answer 1 (score: 0)
A few options to try (a quick classpath check is also sketched after this list):
a. Edit spark-defaults.conf and modify these fields:
spark.driver.extraClassPath /path/to/jar/*
spark.executor.extraClassPath /path/to/jar/*
b. Set the path in your code:
val conf = new SparkConf()
conf.set("spark.driver.extraClassPath", "/path/to/jar/*")
val sc = new SparkContext(conf)
c. Try using --jars=local: or --jars "C:\sqljdbc_6.0\enu\jre8\sqljdbc42.jar"
When running Spark on Windows, adjust the jar path accordingly.
spark-submit.cmd --class ConnTest --master local[4] .\target\scala-2.11\test_2.11-1.0.jar --jars=local:C:\sqljdbc_6.0\enu\jre8\sqljdbc42.jar
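Whichever option you pick, a quick way to confirm the driver jar actually made it onto the driver's classpath is to load the class explicitly before the JDBC read. This is only a diagnostic sketch, not part of either answer:

// Throws ClassNotFoundException if sqljdbc42.jar is missing from the driver classpath;
// otherwise loading the class registers the driver with java.sql.DriverManager,
// which is exactly what the "No suitable driver" error is complaining about.
Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver")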