Error when running Spark Scala code from IDEA

Time: 2017-02-01 16:19:57

Tags: java scala maven intellij-idea

I wrote a simple Spark application in Scala using IDEA, and at runtime it fails with the following error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
at org.apache.spark.util.Utils$.getCallSite(Utils.scala:1406)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:76)
at com.chandler.hellow_world_b6$.main(hellow_world_b6.scala:13)
at com.chandler.hellow_world_b6.main(hellow_world_b6.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)

The process finished with exit code 1. The code is:

import org.apache.spark.{SparkContext, SparkConf}

object hellow_world_b6 {
    def main(args: Array[String]): Unit = {
        println("Hello World   12!")
        val conf = new SparkConf()
        val sc = new SparkContext(conf)
    }
}
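
For context, the exception is raised inside the SparkContext constructor before any application logic runs. A minimal sketch of the same program that also sets an application name and a local master on the SparkConf (both assumed here for illustration; the posted code leaves them unset) would be:

import org.apache.spark.{SparkConf, SparkContext}

object hellow_world_b6 {
    def main(args: Array[String]): Unit = {
        println("Hello World   12!")
        // Assumed for illustration: name the application and run on a local master.
        val conf = new SparkConf()
            .setAppName("hellow_world_b6")
            .setMaster("local[*]")
        val sc = new SparkContext(conf)
        sc.stop()  // release the context when the job is done
    }
}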

The Maven configuration is:

<properties>
    <scala.version>2.12.1</scala.version>
</properties>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
</dependency>
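
For comparison, a configuration in which the scala-library version matches the Scala binary suffix in the Spark artifact id (2.11 here; 2.11.8 is an assumed patch release) would look like this:

<properties>
    <scala.version>2.11.8</scala.version>
</properties>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
</dependency>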

0 Answers:

There are no answers yet.