NoSuchMethodError when using the Spark Cassandra Connector

Date: 2015-11-15 19:44:26

Tags: apache-spark cassandra datastax spark-cassandra-connector

I am trying to use Spark 1.5.1 with the latest version of the spark-cassandra-connector (1.5.0-M2) against Cassandra 2.2.3 (running on localhost).

Here is the basic snippet of code I am using. The keyspace and table have already been created.

import com.datastax.spark.connector._
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setMaster("local[*]").set("spark.cassandra.connection.host", "localhost")
val sc = new SparkContext(conf)

val collection = sc.parallelize(Seq(("word1", 30), ("word2", 40)))
collection.saveToCassandra("test", "words", SomeColumns("word", "count"))
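
For reference, here is a minimal sketch of a schema that would fit this snippet, created through the connector's CassandraConnector (the column types are assumptions based on SomeColumns("word", "count")):

import com.datastax.spark.connector.cql.CassandraConnector

// Hypothetical setup for the "test" keyspace and "words" table; adjust the types to the real schema.
CassandraConnector(conf).withSessionDo { session =>
  session.execute(
    "CREATE KEYSPACE IF NOT EXISTS test " +
      "WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}")
  session.execute(
    "CREATE TABLE IF NOT EXISTS test.words (word text PRIMARY KEY, count int)")
}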

It compiles without any problem with sbt assembly, but I get this error when I submit the application:

Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
at Streamer$.main(Streamer.scala:33)
at Streamer.main(Streamer.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

1 Answer:

Answer 0 (score: 1):

I was using scala_2.11, and apparently my Spark version was built against scala_2.10. So switching to scala_2.10 in my sbt build did the trick for me.
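
For reference, a minimal build.sbt sketch of that fix, using the versions from the question (the exact Scala patch version here is an assumption); the pre-built Spark 1.5.1 binaries are compiled against Scala 2.10:

// Align the application's Scala version with the one Spark was built against.
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  // "provided" keeps Spark out of the assembly jar, since spark-submit supplies it at runtime.
  "org.apache.spark"   %% "spark-core"                % "1.5.1" % "provided",
  // %% resolves to the _2.10 artifact once scalaVersion is 2.10.x.
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.5.0-M2"
)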