I'm running a Spark program in Scala and I'm hitting a runtime error:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
at org.apache.spark.sql.SparkSession$Builder.config(SparkSession.scala:780)
at org.apache.spark.sql.SparkSession$Builder.appName(SparkSession.scala:771)
at tavant.user.userProcess$.getUser(userProcess.scala:10)
at tavant.user.call$delayedInit$body.apply(call.scala:4)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
at scala.App$class.main(App.scala:71)
at tavant.user.call$.main(call.scala:3)
at tavant.user.call.main(call.scala)
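For context, the code at userProcess.scala:10 that the top of the trace points to is roughly of this shape (a reconstruction from the stack trace, not my exact source; the object and method names come from the trace, the app name string is a placeholder):

package tavant.user

import org.apache.spark.sql.SparkSession

object userProcess {
  def getUser(): Unit = {
    // Line 10 in the trace: Builder.appName delegates to Builder.config,
    // which matches the two adjacent frames and is where the
    // NoSuchMethodError on scala.Predef$.ArrowAssoc is thrown.
    val spark = SparkSession.builder()
      .appName("userProcess")
      .getOrCreate()
    // ... rest of the job
  }
}

object call extends App {
  userProcess.getUser()   // call.scala:4 in the trace
}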
I think this is caused by a version mismatch. Here is the versioning I'm currently using in my pom:
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0-cdh5.15.1</version>
</dependency>
In addition, I'm using the latest dynamic 2.10 Scala compiler bundle. I also tried 2.11, but it failed again. Does anyone know which Scala version is compatible with the Spark version I'm currently using?
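For reference, this is the kind of alignment I understand the pom should express, since the _2.10 suffix on spark-core means that Spark build targets Scala 2.10 (a sketch only; the 2.10.6 patch version and the provided scope are my assumptions, not a confirmed fix):

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <!-- a 2.10.x patch release, matching the _2.10 suffix of the Spark artifact -->
    <version>2.10.6</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0-cdh5.15.1</version>
    <!-- provided: assuming the cluster supplies its own Spark and Scala jars at runtime -->
    <scope>provided</scope>
</dependency>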