SparkContext statusStore - NoSuchMethodError

Date: 2019-04-15 07:04:49

Tags: apache-spark apache-spark-sql apache-spark-2.0

I have been trying to run a job on Spark 2.2 and keep hitting the error below, which I have not been able to resolve. I am using the following JARs as dependencies:

spark-sql_2.11-2.3.0.jar, spark-core_2.11-2.3.0.jar, spark-kvstore_2.11-2.3.0.jar, spark-hive_2.11-2.3.0.jar
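For reference, a minimal sketch of how these modules are usually pinned to a single Spark version in an sbt build (the build tool is not stated in the question, so sbt is an assumption). `SparkContext.statusStore()` only exists from Spark 2.3 onward, so mixing 2.3.0 Spark SQL jars with a 2.2.x runtime on the classpath produces exactly this kind of `NoSuchMethodError`:

```scala
// Hypothetical build.sbt sketch: every Spark module resolves to one version,
// and that version matches the Spark runtime on the cluster.
scalaVersion := "2.11.12"

val sparkVersion = "2.3.0"  // keep in sync with the cluster's Spark version

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided",
  "org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
  // spark-kvstore is pulled in transitively by spark-core
)
```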

Error:

> Exception in thread "main" java.lang.NoSuchMethodError:
> org.apache.spark.SparkContext.statusStore()Lorg/apache/spark/status/AppStatusStore;
>         at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:91)
>         at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:117)
>         at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:117)
>         at scala.Option.getOrElse(Option.scala:121)
>         at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:117)
>         at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:116)
>         at org.apache.spark.sql.SparkSession.newSession(SparkSession.scala:236)
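The question does not include the code that triggers this, but as a rough sketch, any program that builds a `SparkSession` and then calls `newSession()` (the bottom frame of the trace) exercises this path, because `newSession()` forces `SharedState` to initialize, which in Spark 2.3.x calls `SparkContext.statusStore()`:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical minimal program that follows the code path in the stack trace.
object StatusStoreRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("statusStore-repro")
      .master("local[*]")
      .getOrCreate()

    // newSession() triggers SparkSession.sharedState -> SharedState.<init>,
    // which calls SparkContext.statusStore() in Spark 2.3.x.
    val session2 = spark.newSession()
    session2.range(10).show()

    spark.stop()
  }
}
```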

0 Answers:

There are no answers yet.