Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps(

Asked: 2017-04-03 19:37:39

Tags: java scala apache-spark intellij-idea sparkcore

I am new to Scala, and I am getting an error in IntelliJ with the following code. Can anyone help me fix it?

     import org.apache.spark.{SparkConf, SparkContext}

     object wordcount {
       def main(args: Array[String]): Unit = {
         val conf = new SparkConf()
           .setMaster("local[*]")
           .setAppName("TestSpark")
           .set("spark.executor.memory", "2g")

         val sc = new SparkContext(conf)
         // Parallelize a few sample lines into an RDD
         val a = sc.parallelize(Seq(
           "This is the firstline",
           "This is the second line",
           "This is the third line"))
         // Split each line into words, then count occurrences of each word
         val count = a.flatMap(x => x.split(" "))
         val counts = count.map(word => (word, 1)).reduceByKey((x, y) => x + y)
         counts.foreach(println)
       }
     }

I am getting the following error:

      Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
          at org.apache.spark.util.Utils$.getCallSite(Utils.scala:1342)
          at org.apache.spark.SparkContext.<init>(SparkContext.scala:81)
          at wordcount$.main(wordcount.scala:12)
          at wordcount.main(wordcount.scala)
      Using Spark's default log4j profile: org/apache/spark/log4j-

Below is my build.sbt:

      name := "scalaprograms"

      version := "1.0"

       scalaVersion := "2.12.1"

      libraryDependencies += "org.apache.spark" % "spark-core_2.11" %

1 Answer:

Answer 0 (score: 9)

You should use Scala 2.11 in order to use spark-core_2.11, i.e. use:

scalaVersion := "2.11.8"

AFAIK, Spark does not work with Scala 2.12 yet. The NoSuchMethodError comes from running Spark classes compiled against the Scala 2.11 standard library on a 2.12 runtime, where that method signature no longer exists.
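For reference, a minimal sketch of a consistent build.sbt. The Spark version "2.1.0" is an assumption, since the dependency line in the question is cut off before the version:

      name := "scalaprograms"

      version := "1.0"

      // The Scala binary version must match the _2.11 suffix of the Spark artifact
      scalaVersion := "2.11.8"

      // "2.1.0" is an assumed Spark version; the question truncates the real one
      libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"

Alternatively, writing the dependency with %% ("org.apache.spark" %% "spark-core" % "2.1.0") lets sbt append the Scala binary-version suffix automatically, so the artifact always tracks scalaVersion and this class of mismatch cannot occur.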