Running a Spark Scala example fails

Time: 2014-10-14 01:29:33

Tags: scala intellij-idea apache-spark

I'm new to Spark and Scala. I've created an IntelliJ Scala project using SBT and added a few lines to build.sbt:

name := "test-one"

version := "1.0"

scalaVersion := "2.11.2"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"

My Scala version is 2.10.4, but this problem also occurs with 2.11.2:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
    at akka.util.Collections$EmptyImmutableSeq$.<init>(Collections.scala:15)
    at akka.util.Collections$EmptyImmutableSeq$.<clinit>(Collections.scala)
    at akka.japi.Util$.immutableSeq(JavaAPI.scala:209)
    at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:150)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:153)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
    at TweeProcessor$.main(TweeProcessor.scala:10)
    at TweeProcessor.main(TweeProcessor.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 23 more

I tried searching online; most answers point to a mismatch between the API version and the Scala version, but none are specific to Spark.

4 Answers:

Answer 0 (score: 23):

spark-core_2.10 is built for the 2.10.x series of Scala. You should use

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"

which will select the correct _2.10 or _2.11 artifact for your Scala version.

Also, make sure you are compiling against the same versions of Scala and Spark as the ones on the cluster where you will run this.
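
For reference, the full build.sbt from the question rewritten with the %% form might look like this: a minimal sketch assuming you stay on Scala 2.10.4, so that %% resolves to the _2.10 artifacts that Spark 1.1.0 actually publishes.

name := "test-one"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"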

Answer 1 (score: 10):

Downgrade the Scala version to 2.10.4:

name := "test-one"

version := "1.0"

//scalaVersion := "2.11.2"
scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"

Answer 2 (score: 1):

This is a version compatibility issue. spark-core_2.10 is built with Scala 2.10, while your sbt file says you are using Scala 2.11. Either downgrade the Scala version to 2.10, or upgrade to a Spark build published for 2.11.
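
In build.sbt terms, the two options look roughly like the sketch below. Note the second option assumes a Spark release that actually publishes _2.11 artifacts (those first appeared around Spark 1.2, so 1.1.0 only offers _2.10).

// Option 1: downgrade Scala to match the spark-core_2.10 artifact
scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"

// Option 2: stay on Scala 2.11 and use a Spark build published for 2.11
scalaVersion := "2.11.2"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.0"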

Answer 3 (score: 1):

scalaVersion := "2.11.1"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.2.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.2.0"
)

This configuration worked for me.
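
As a minor variation: since scalaVersion is 2.11.1 here, the explicit _2.11 suffixes can equivalently be left to sbt by using %%, which appends the suffix automatically:

scalaVersion := "2.11.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0",
  "org.apache.spark" %% "spark-sql" % "2.2.0"
)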