I am trying to query Cassandra from Spark using the Datastax Spark-Cassandra connector. The Spark code is
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._ // adds cassandraTable to SparkContext

val conf = new SparkConf(true)
  .setMaster("local[4]")
  .setAppName("cassandra_query")
  .set("spark.cassandra.connection.host", "mycassandrahost")
val sc = new SparkContext(conf)
val rdd = sc.cassandraTable("mykeyspace", "mytable").limit(10)
rdd.foreach(println)
sc.stop()
so for now it is running locally. My build.sbt file looks like
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.0.0",
"org.apache.spark" %% "spark-sql" % "2.0.0",
"cc.mallet" % "mallet" % "2.0.7",
"com.amazonaws" % "aws-java-sdk" % "1.11.229",
"com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.0"
)
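For completeness, here is a minimal sketch of the sbt-assembly setup behind the fat jar mentioned below (the plugin version is illustrative, and the merge strategy is the usual workaround for duplicate META-INF entries in Spark fat jars, not something specific to this build):

// project/plugins.sbt (illustrative version)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")

// build.sbt: discard the duplicate META-INF entries that Spark's
// transitive dependencies commonly collide on
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}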
I build a fat jar with the assembly plugin, and when I submit the Spark job I get the following error:
Lost task 6.0 in stage 0.0 (TID 6) on executor localhost: java.io.IOException (Exception during preparation of SELECT "pcid", "content" FROM "mykeyspace"."mytable" WHERE token("pcid") > ? AND token("pcid") <= ? LIMIT 10 ALLOW FILTERING: class java.time.LocalDate in JavaMirror with org.apache.spark.util.MutableURLClassLoader@923288b of type class org.apache.spark.util.MutableURLClassLoader with classpath [file:/root/GenderPrediction-assembly-0.1.jar] and parent being sun.misc.Launcher$AppClassLoader@1e69dff6 of type class sun.misc.Launcher$AppClassLoader with classpath [file:/root/spark/conf/,file:/root/spark/jars/datanucleus-core-3.2.10.jar,...not found.
(Note: there were far too many jars listed in that classpath, so I replaced them with "...".)
So it looks like it cannot find java.time.LocalDate. How can I fix this?
I found another post that looks similar, spark job cassandra error, but it involves a different class that cannot be found, so I am not sure it helps.
Answer 0 (score: 2)
java.time.LocalDate is part of Java 8, so it seems you are running a Java version lower than 8.

spark-cassandra-connector 2.0 requires Java 8. See Spark Cassandra version compatibility.
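A quick way to verify which Java your executors actually run is to ask the JVM from inside a job; here is a minimal sketch, reusing the sc from your code above (all reported versions need to be 1.8 or later):

// Java version of the driver JVM
println(System.getProperty("java.version"))

// Java version(s) seen on the executor JVMs
val executorJavaVersions = sc.parallelize(1 to 100)
  .map(_ => System.getProperty("java.version"))
  .distinct()
  .collect()
executorJavaVersions.foreach(println)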
Answer 1 (score: 1)
Can you try
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.0",
  "org.apache.spark" %% "spark-sql" % "2.0.0",
  "cc.mallet" % "mallet" % "2.0.7",
  "com.amazonaws" % "aws-java-sdk" % "1.11.229",
  "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.0" exclude("joda-time", "joda-time"),
  "joda-time" % "joda-time" % "2.3"
)
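The idea here (my reading, not stated in the question) is to end up with exactly one copy of joda-time in the fat jar: aws-java-sdk also pulls in joda-time, so excluding the connector's copy and pinning an explicit version avoids assembling two conflicting versions together.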