Spark program throws an exception

Time: 2018-05-31 18:16:20

Tags: apache-spark exception

I have the following Spark program, which I run on Windows (Eclipse / Maven).

import org.apache.spark.sql.SparkSession;

public class LoadingJson {

    public static void main(String[] args) {

        // Build a local SparkSession, then read and display a JSON file.
        SparkSession ss = SparkSession.builder()
                         .appName("My First App in newer version of Spark")
                         .master("local")
                         .getOrCreate();
        ss.read()
          .json("G:\\users\\student.json").show();

    }

}

Here is the POM file:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>2.2.0</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>

    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.11.0</version>
    </dependency>

    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-xml</artifactId>
        <version>2.11.0-M4</version>
    </dependency>

    <dependency>
        <groupId>org.scala-lang.modules</groupId>
        <artifactId>scala-xml_2.11</artifactId>
        <version>1.0.6</version>
    </dependency>
</dependencies>

After submitting the program, I get the following exception. Could you help me resolve it?

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
    at org.apache.spark.ui.jobs.AllJobsPage.<init>(AllJobsPage.scala:39)
    at org.apache.spark.ui.jobs.JobsTab.<init>(JobsTab.scala:38)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:67)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:84)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:221)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:163)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:452)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:909)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:901)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
    at spark.LoadingJson.main(LoadingJson.java:14)

1 Answer:

Answer 0 (score: 2)

You need to use the same Scala version for every dependency. Note that you are using Scala 2.11 everywhere except for Spark core:

spark-core_2.10

needs to be

spark-core_2.11
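
To fix it, align every artifact on the same Scala binary version. Below is a minimal sketch of what the dependency section could look like with everything on Scala 2.11; the 2.11.8 patch version is an assumption (it is the Scala release Spark 2.2.x was built against, and any 2.11.x release should be binary compatible):

<dependencies>
    <!-- Spark artifacts must all carry the same Scala suffix (_2.11). -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>

    <!-- Scala library matching the _2.11 suffix above. -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.11.8</version>
    </dependency>

    <!-- scala-xml for Scala 2.11, from org.scala-lang.modules. -->
    <dependency>
        <groupId>org.scala-lang.modules</groupId>
        <artifactId>scala-xml_2.11</artifactId>
        <version>1.0.6</version>
    </dependency>
</dependencies>

Note that the org.scala-lang:scala-xml 2.11.0-M4 dependency is dropped entirely: it is a milestone build compiled against a pre-release Scala, and org.scala-lang.modules:scala-xml_2.11 already provides the same classes for Scala 2.11.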