How do I overcome a Scala NoSuchMethodError?

Asked: 2016-06-09 14:29:02

Tags: scala apache-spark

I'm just getting started with Scala and Spark, and I'm trying to run this simple program:

package spark.example

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SparkGrep {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark locally on all cores; "localhost" is not a valid master URL
    val conf = new SparkConf().setAppName("SparkGrep").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val inputFile = sc.textFile("/Users/eugene/Downloads/hello.txt").cache()
    val matchTerm: String = "hello"
    val numMatches = inputFile.filter(line => line.contains(matchTerm)).count()
    // args(1) would throw ArrayIndexOutOfBoundsException when run without arguments
    println("%s lines contain %s".format(numMatches, matchTerm))
    sc.stop()
  }
}

It fails with this error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
    at akka.actor.ActorCell$.<init>(ActorCell.scala:305)
    at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
    at akka.actor.RootActorPath.$div(ActorPath.scala:152)
    at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:465)
    at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:124)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
    at scala.util.Try$.apply(Try.scala:192)
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
    at scala.util.Success.flatMap(Try.scala:231)
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:550)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:96)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
    at spark.example.SparkGrep$.main(SparkGrep.scala:14)
    at spark.example.SparkGrep.main(SparkGrep.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)

What should I change in this simple code to get it to run?

2 Answers:

Answer 0 (score: 2)

This is a fairly common mistake: forgetting that Spark is compiled against an older version of Scala (2.10.x as of Spark 1.6.1; this is likely to change soon with Spark 2.0). A `NoSuchMethodError` on a core class like `scala.collection.immutable.HashSet` is the classic symptom of mixing binary-incompatible Scala versions on the classpath.

Changing your build to compile against that Scala version should fix the runtime problem.
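Assuming an sbt build (the project name and version numbers below are illustrative), pinning the Scala version to match the Spark binaries looks roughly like this minimal `build.sbt` sketch:

```scala
// build.sbt — minimal sketch; adjust the version numbers to match
// the Spark distribution you actually run against.
name := "spark-grep"

// Spark 1.6.1 binaries are published for Scala 2.10.x, so the
// application must be compiled with the same binary version.
scalaVersion := "2.10.6"

// %% appends the Scala binary suffix (_2.10) to the artifact name,
// resolving org.apache.spark:spark-core_2.10:1.6.1.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
```

If you later submit the jar with `spark-submit` instead of running it from the IDE, mark the dependency `% "provided"` so the cluster's own Spark jars are used.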

Answer 1 (score: 0)

Alternatively, you can build Spark 1.6.1 itself against Scala 2.11.x. First switch the build to 2.11 with
./dev/change-scala-version.sh 2.11
and then build with
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
For more information: http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211
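Whichever route you take, a quick sanity check is to print the version of the Scala library that is actually on the application's classpath at runtime. This is a minimal sketch using the standard `scala.util.Properties` API; if the printed version differs from the one Spark was compiled against (2.10.x for stock Spark 1.6.1), a `NoSuchMethodError` is expected:

```scala
// Prints the Scala library version found on the runtime classpath.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    // Full banner, e.g. "version 2.10.6"
    println(scala.util.Properties.versionString)
    // Bare number, e.g. "2.10.6" — compare this against the suffix
    // of your Spark artifacts (spark-core_2.10 vs spark-core_2.11).
    println(scala.util.Properties.versionNumberString)
  }
}
```

Running this from the same launcher (IDE run configuration or spark-submit) that produced the error shows exactly which Scala library the conflicting classpath picked up.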