java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V

Date: 2019-04-29 15:47:41

Tags: apache-spark

My Spark project runs fine (2.4.0), but when I add the following dependency:

            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-streaming_2.12</artifactId>
                <version>2.4.2</version>
                <scope>provided</scope>
             </dependency>

I get the following stack trace:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
    at scala.xml.Null$.<init>(Null.scala:23)
    at scala.xml.Null$.<clinit>(Null.scala)
    at org.apache.spark.ui.jobs.AllJobsPage.<init>(AllJobsPage.scala:43)
    at org.apache.spark.ui.jobs.JobsTab.<init>(JobsTab.scala:45)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:61)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:80)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:175)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:444)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
    at package.SparkUtils.initSession(SparkUtils.java:23)
    at package.MainClass.main(MainClass.java:80)

1 answer:

Answer 0 (score: 0)

Scala 2.12 is not binary-compatible with 2.11, so make sure that spark-core and all your other dependencies are built against the same Scala version.

From the stack trace above, your streaming library is built for Scala 2.12, while another dependency is not using 2.12.
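As a minimal sketch of the fix (assuming the rest of the project targets Spark 2.4.0 with Scala 2.11, which is the default Scala version for that Spark release), you can pin the Scala binary version and the Spark version in shared Maven properties so every Spark artifact resolves to matching suffixes:

```xml
<properties>
    <!-- The Scala binary version must match across ALL Spark artifacts -->
    <scala.binary.version>2.11</scala.binary.version>
    <spark.version>2.4.0</spark.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>
    <!-- Same suffix as spark-core, instead of spark-streaming_2.12 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>
</dependencies>
```

Running `mvn dependency:tree` and checking that no resolved artifact ends in `_2.12` is a quick way to confirm the mismatch is gone.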