java.lang.NoSuchMethodError with Jackson databind and Spark

Date: 2015-05-02 09:25:07

Tags: json scala jackson apache-spark

I'm trying to run spark-submit with Spark 1.1.0 and Jackson 2.4.4. I have Scala code that uses Jackson to deserialize JSON into case classes. This works fine on its own, but when I use it with Spark I get the following error:

15/05/01 17:50:11 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 2)
java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.introspect.POJOPropertyBuilder.addField(Lcom/fasterxml/jackson/databind/introspect/AnnotatedField;Lcom/fasterxml/jackson/databind/PropertyName;ZZZ)V
    at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector.com$fasterxml$jackson$module$scala$introspect$ScalaPropertiesCollector$$_addField(ScalaPropertiesCollector.scala:109)
    at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields$2$$anonfun$apply$11.apply(ScalaPropertiesCollector.scala:100)
    at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields$2$$anonfun$apply$11.apply(ScalaPropertiesCollector.scala:99)
    at scala.Option.foreach(Option.scala:236)
    at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields$2.apply(ScalaPropertiesCollector.scala:99)
    at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields$2.apply(ScalaPropertiesCollector.scala:93)
    at scala.collection.GenTraversableViewLike$Filtered$$anonfun$foreach$4.apply(GenTraversableViewLike.scala:109)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.SeqLike$$anon$2.foreach(SeqLike.scala:635)
    at scala.collection.GenTraversableViewLike$Filtered$class.foreach(GenTraversableViewLike.scala:108)
    at scala.collection.SeqViewLike$$anon$5.foreach(SeqViewLike.scala:80)
    at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector._addFields(ScalaPropertiesCollector.scala:93)
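For reference, the deserialization code is along these lines (a minimal sketch using jackson-module-scala, with a hypothetical Person case class standing in for the real ones):

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Hypothetical case class, used only for illustration
case class Person(name: String, age: Int)

object JsonExample {
  def main(args: Array[String]): Unit = {
    val mapper = new ObjectMapper()
    mapper.registerModule(DefaultScalaModule) // required for Scala case-class support
    val p = mapper.readValue("""{"name":"Ann","age":30}""", classOf[Person])
    println(p) // prints Person(Ann,30)
  }
}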

Here is my build.sbt:

//scalaVersion in ThisBuild := "2.11.4"
scalaVersion in ThisBuild := "2.10.5"

retrieveManaged := true

libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value

libraryDependencies ++= Seq(
  "junit" % "junit" % "4.12" % "test",
  "org.scalatest" %% "scalatest" % "2.2.4" % "test",
  "org.mockito" % "mockito-core" % "1.9.5",
  "org.specs2" %% "specs2" % "2.1.1" % "test",
  "org.scalatest" %% "scalatest" % "2.2.4" % "test"
)

libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-core" % "0.20.2",
  "org.apache.hbase" % "hbase" % "0.94.6"
)

//libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.3.0"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"


libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.4.4"
//libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.3.1"
//libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.5.0"

libraryDependencies += "com.typesafe" % "config" % "1.2.1"

resolvers += Resolver.mavenLocal

As you can see, I've tried a number of different Jackson versions.

Here is the shell script I use to run spark-submit:

#!/bin/bash
sbt package

CLASS=com.org.test.spark.test.SparkTest

SPARKDIR=/Users/user/Desktop/
#SPARKVERSION=1.3.0
SPARKVERSION=1.1.0
SPARK="$SPARKDIR/spark-$SPARKVERSION/bin/spark-submit"

jar_jackson=/Users/user/scala_projects/lib_managed/bundles/com.fasterxml.jackson.module/jackson-module-scala_2.10/jackson-module-scala_2.10-2.4.4.jar

"$SPARK" \
  --class "$CLASS" \
  --jars $jar_jackson \
  --master local[4] \
  /Users/user/scala_projects/target/scala-2.10/spark_project_2.10-0.1-SNAPSHOT.jar \
  print /Users/user/test.json

I pass the path to the Jackson jar to the spark-submit command with --jars. I've tried different versions of Spark as well. I've even specified the paths to the individual Jackson jars (databind, annotations, etc.), but that didn't solve the problem. Any help would be appreciated. Thanks.

5 Answers:

Answer 0 (score: 1):

I just ran into the same problem with Jackson and Spark. Since I use SBT, like user1077071, I followed these steps:

  1. Installed the excellent dependency-graph plugin for SBT: https://github.com/jrudolph/sbt-dependency-graph
  2. Found that in my case, play-json depends on Jackson 2.3
  3. Added Jackson 2.4 to my libraryDependencies
  4. I did have to take this approach for multiple Jackson libs: core, annotations, and databind. Databind was the culprit, but the others should be bumped as well to avoid conflicts.

After that, it worked like a charm. A sketch of the setup is below.
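In sbt terms, steps 1 and 3-4 might look roughly like this (a sketch only: the plugin version below is an assumption, and the Jackson version should match whatever your Spark build expects):

// project/plugins.sbt -- enables the dependencyTree / dependencyGraph tasks
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")

// build.sbt -- pin all three Jackson artifacts to a single version
dependencyOverrides ++= Set(
  "com.fasterxml.jackson.core" % "jackson-core"        % "2.4.4",
  "com.fasterxml.jackson.core" % "jackson-annotations" % "2.4.4",
  "com.fasterxml.jackson.core" % "jackson-databind"    % "2.4.4"
)

Running sbt dependencyTree then shows exactly which library drags in the conflicting Jackson.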

Answer 1 (score: 0):

I think the main reason is that you haven't specified the dependencies correctly.

If you use third-party libraries and then submit the jar directly to Spark, the better approach is to build a fat jar with sbt-assembly (https://github.com/sbt/sbt-assembly).
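A minimal sketch of that setup (the plugin version is an assumption; the key idea is to mark Spark itself as "provided" so the fat jar bundles only your own dependencies, Jackson included):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")

// build.sbt -- Spark is already on the executor classpath at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" % "provided"

Building with sbt assembly then produces a single jar that you hand to spark-submit directly, with no --jars flag needed for Jackson.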

Answer 2 (score: 0):

I was getting a java.lang.NoSuchMethodError from Jackson databind for the method ...introspect.AnnotatedMember.annotations(). The problem was solved by updating the Maven dependency on jackson-databind from version 2.9.0.pr3 to 2.9.1.

Answer 3 (score: 0):

I got the error

java.lang.NoSuchMethodError: com.fasterxml.jackson.core.JsonStreamContext.<init>(Lcom/fasterxml/jackson/core/JsonStreamContext;)V

after updating the com.fasterxml.jackson.core libraries from 2.8.9 to 2.9.1.

In my case, the solution was to inspect the Gradle dependencies and exclude all the conflicts in build.gradle:

compile('org.springframework.boot:spring-boot-starter-web:1.5.7.RELEASE') {
    exclude group: "com.fasterxml.jackson.core"
}

compile('org.springframework.boot:spring-boot-starter-jdbc:1.5.7.RELEASE') {
    exclude group: "com.fasterxml.jackson.core"
}

compile('com.fasterxml.jackson.core:jackson-databind:2.9.1') {
    exclude module: "jackson-annotations"
    exclude module: "jackson-core"
}

compile('com.fasterxml.jackson.core:jackson-annotations:2.9.1')

compile('com.fasterxml.jackson.core:jackson-core:2.9.1')

compile 'org.scala-lang:scala-library:2.12.3'

compile('com.fasterxml.jackson.module:jackson-module-scala_2.12:2.9.1') {
    exclude group: "org.scala-lang"
    exclude module: "jackson-core"
    exclude module: "jackson-annotations"
    exclude module: "jackson-databind"
}
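To find the conflicts in the first place, running gradle dependencies prints the resolved dependency tree per configuration, Gradle's counterpart to the sbt plugin mentioned in the answer above.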

Answer 4 (score: 0):

If you are using the latest Spark version, 3.0.0-preview2, the following configuration works in build.sbt:

name := "scala-streams"

version := "0.1"

scalaVersion := "2.12.10"
val sparkVersion = "3.0.0-preview2"
val playVersion="2.8.1"

val jacksonVersion="2.10.1"

//override if you wish to
//dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % jacksonVersion
//dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % jacksonVersion

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "com.typesafe.play" %% "play-json" % playVersion
)
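If spark-core and play-json still resolve to different Jackson versions, the commented-out dependencyOverrides lines above are the escape hatch: uncommenting them pins jackson-core and jackson-databind to the single jacksonVersion, which is the same fix the earlier answers apply.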