ERROR SparkContext: Error initializing SparkContext - Java + Eclipse + Spark

Date: 2018-09-19 10:12:18

Tags: java eclipse apache-spark apache-spark-sql

I am just getting started with Spark. I am trying out some sample projects, and am currently working on one that reads from a CSV file. The problem occurs when I run the application: the Eclipse console shows the following error:

18/09/19 05:00:48 ERROR MetricsSystem: Sink class org.apache.spark.metrics.sink.MetricsServlet cannot be instantiated
18/09/19 05:00:48 ERROR SparkContext: Error initializing SparkContext.
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:200)
    at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:194)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
    at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:194)
    at org.apache.spark.metrics.MetricsSystem.start(MetricsSystem.scala:102)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:522)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
    at com.mycsv.app.CSVFileAnalysisInSparkSQL.main(CSVFileAnalysisInSparkSQL.java:28)
Caused by: java.lang.NoSuchMethodError: com.fasterxml.jackson.annotation.JsonFormat$Value.empty()Lcom/fasterxml/jackson/annotation/JsonFormat$Value;
    at com.fasterxml.jackson.databind.cfg.MapperConfig.<clinit>(MapperConfig.java:50)
    at com.fasterxml.jackson.databind.ObjectMapper.<init>(ObjectMapper.java:543)
    at com.fasterxml.jackson.databind.ObjectMapper.<init>(ObjectMapper.java:460)
    at org.apache.spark.metrics.sink.MetricsServlet.<init>(MetricsServlet.scala:48)
    ... 20 more

My project code is as follows:

final SparkSession sparkSession = SparkSession.builder().appName("Spark CSV").master("local[5]").getOrCreate();

What could be the problem with the SparkContext? Thanks in advance.

1 Answer:

Answer 0 (score: 0)

The error is caused by a missing or mismatched Jackson dependency on the classpath. The root cause in the stack trace, `java.lang.NoSuchMethodError: com.fasterxml.jackson.annotation.JsonFormat$Value.empty()`, means that the `jackson-databind` jar being loaded is newer than the `jackson-annotations` jar it runs against, so a method it expects does not exist. Check your project's build and make sure all Jackson artifacts (`jackson-core`, `jackson-databind`, `jackson-annotations`) resolve to the same, mutually compatible version, matching what your Spark distribution ships with.
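If the project is built with Maven, one way to enforce this is a `dependencyManagement` section that pins all three Jackson artifacts to a single version. This is a sketch, not a definitive fix: the version `2.6.7` below is illustrative (it matches some Spark 2.x releases) and should be replaced with whatever version your actual Spark dependency pulls in.

```xml
<dependencyManagement>
  <dependencies>
    <!-- Pin all Jackson artifacts to one matching version.
         2.6.7 is an assumption; align it with your Spark version. -->
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
      <version>2.6.7</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.6.7</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-annotations</artifactId>
      <version>2.6.7</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

To find out which conflicting Jackson versions are currently on the classpath, you can run `mvn dependency:tree -Dincludes=com.fasterxml.jackson.core` and look for artifacts resolved to different versions.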