How to fix the exception "java.lang.ExceptionInInitializerError" in Spark

Asked: 2019-04-28 15:14:06

Tags: java apache-spark

Before running it on my own data, I used the code at the link below to test Spark:

https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/ml/JavaFPGrowthExample.java

These are my Hadoop and Spark dependencies:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.12</artifactId>
    <version>2.4.0</version>
    <scope>runtime</scope>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.2.0</version>
</dependency>

I'm getting this exception:

Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3038)
at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3036)
at org.apache.spark.ml.util.Instrumentation.logDataset(Instrumentation.scala:60)
at org.apache.spark.ml.fpm.FPGrowth.$anonfun$genericFit$1(FPGrowth.scala:169)
at org.apache.spark.ml.util.Instrumentation$.$anonfun$instrumented$1(Instrumentation.scala:183)
at scala.util.Try$.apply(Try.scala:209)
at org.apache.spark.ml.util.Instrumentation$.instrumented(Instrumentation.scala:183)
at org.apache.spark.ml.fpm.FPGrowth.genericFit(FPGrowth.scala:165)
at org.apache.spark.ml.fpm.FPGrowth.fit(FPGrowth.scala:162)
at Spark.main(Spark.java:45)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.5
at com.fasterxml.jackson.module.scala.JacksonModule.setupModule(JacksonModule.scala:64)
at com.fasterxml.jackson.module.scala.JacksonModule.setupModule$(JacksonModule.scala:51)
at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:751)
at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
... 14 more

Is this a known issue, and how can it be fixed?
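For context, the "Incompatible Jackson version: 2.9.5" message is thrown by jackson-module-scala, which Spark 2.4.0 ships built against Jackson 2.6.7, when it finds the Jackson 2.9.5 that hadoop-common 3.2.0 pulls in transitively. One workaround that has resolved similar conflicts (a sketch, assuming a Maven pom.xml build, not a confirmed fix for this exact setup) is to pin a single consistent Jackson version so jackson-module-scala matches jackson-databind:

```xml
<!-- Sketch: align jackson-module-scala with the jackson-databind 2.9.5
     that hadoop-common 3.2.0 already brings onto the classpath. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.9.5</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.module</groupId>
      <artifactId>jackson-module-scala_2.12</artifactId>
      <version>2.9.5</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Running `mvn dependency:tree -Dincludes=com.fasterxml.jackson.core` can confirm which Jackson versions actually end up on the classpath after the override.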

Note:

I ran into the same problem in Scala:

Spark2.1.0 incompatible Jackson versions 2.7.6

but I wasn't able to apply the same fix in Java.

0 Answers:

There are no answers yet.