Spark Kafka Streaming gives an incompatible Jackson exception

Date: 2017-02-03 19:55:59

Tags: scala apache-spark jackson apache-kafka spark-streaming

This is the error I am getting:

java.lang.ExceptionInInitializerError
    at org.apache.spark.streaming.dstream.InputDStream.<init>(InputDStream.scala:78)
    at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.<init>(DirectKafkaInputDStream.scala:62)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:150)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:127)
    at chirpanywhere.stream.classification.service.streaming.StreamingClassificationPipeline.createStreamAndTransform(StreamingClassificationPipeline.scala:69)
    at chirpanywhere.stream.classification.service.streaming.StreamingClassificationPipeline.streamAndUpdateMLModelCache(StreamingClassificationPipeline.scala:58)
    at chirpanywhere.stream.classification.service.streaming.StreamingClassificationPipeline.run(StreamingClassificationPipeline.scala:54)
    at chirpanywhere.stream.classification.service.Boot$.main(Boot.scala:10)
    at chirpanywhere.stream.classification.service.Boot.main(Boot.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.7.8
    at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
    at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
    at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:730)
    at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
    at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
    at org.apache.spark.streaming.dstream.InputDStream.<init>(InputDStream.scala:78)
    at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.<init>(DirectKafkaInputDStream.scala:62)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:150)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:127)
    at chirpanywhere.stream.classification.service.streaming.StreamingClassificationPipeline.createStreamAndTransform(StreamingClassificationPipeline.scala:69)
    at chirpanywhere.stream.classification.service.streaming.StreamingClassificationPipeline.streamAndUpdateMLModelCache(StreamingClassificationPipeline.scala:58)
    at chirpanywhere.stream.classification.service.streaming.StreamingClassificationPipeline.run(StreamingClassificationPipeline.scala:54)
    at chirpanywhere.stream.classification.service.Boot$.main(Boot.scala:10)
    at chirpanywhere.stream.classification.service.Boot.main(Boot.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)

Can someone tell me what the problem is?

My build.sbt looks like this:

    val akkaHttpV = "10.0.0"
    val sparkV = "2.1.0"

    libraryDependencies ++= Seq(
      "log4j" % "log4j" % "1.2.17",
      "com.typesafe.akka" %% "akka-http" % akkaHttpV,
      "com.typesafe.akka" %% "akka-http-spray-json" % akkaHttpV,
      "org.slf4j" % "slf4j-api" % "1.7.21",
      "org.apache.spark" %% "spark-core" % sparkV,
      "org.apache.spark" %% "spark-sql" % sparkV,
      "org.apache.spark" %% "spark-streaming" % sparkV,
      "org.apache.spark" %% "spark-mllib" % sparkV,
      "com.github.blemale" %% "scaffeine" % "2.0.0" % "compile",
      "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkV,
      "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0-M3",
      "org.json4s" %% "json4s-native" % "3.5.0",
      "org.scalatest" %% "scalatest" % "3.0.0" % "test"
    )

1 Answer:

Answer 0 (score: 2):

The Jackson Scala module throws this exception when it checks the version of the jackson-databind it is deployed against. The Scala module requires the same minor version; the exception message here is unfortunately incomplete (it should include both versions). My guess is that you have a newer jackson-databind (2.7.8 is reasonably up to date) and an older Scala module, perhaps one bundled by Spark (2.5.3?).
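
To confirm which versions actually end up on your classpath, a small runtime check along these lines can help. This is only an illustrative sketch that reads Jackson's own version metadata (the PackageVersion.VERSION constant in jackson-databind and the Scala module's version method); the object name is hypothetical:

    import com.fasterxml.jackson.databind.cfg.PackageVersion
    import com.fasterxml.jackson.module.scala.DefaultScalaModule

    object JacksonVersionCheck {
      def main(args: Array[String]): Unit = {
        // The jackson-databind version actually loaded on the classpath
        println(s"jackson-databind:     ${PackageVersion.VERSION}")
        // The Scala module's own version; its major.minor must match databind's
        println(s"jackson-module-scala: ${DefaultScalaModule.version}")
      }
    }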

The Scala module is being extra careful here, and often "close enough" minor versions would work: but the solution is to make sure jackson-databind and the Scala module have the same minor version (and possibly even the same patch version).
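
In sbt, one way to do that is to pin all Jackson artifacts to a single version. The sketch below assumes Spark 2.1.0 expects a Jackson 2.6.x release (verify the exact version that wins on your classpath first, for example with sbt's built-in evicted task) and uses sbt 0.13 syntax, where dependencyOverrides is a Set:

    // Sketch only: 2.6.5 is an assumed version, align it with whatever Spark actually ships
    dependencyOverrides ++= Set(
      "com.fasterxml.jackson.core"   %  "jackson-core"          % "2.6.5",
      "com.fasterxml.jackson.core"   %  "jackson-databind"      % "2.6.5",
      "com.fasterxml.jackson.module" %% "jackson-module-scala"  % "2.6.5"
    )

Overriding only jackson-databind without the Scala module (or the other way around) just moves the mismatch, so both should be forced to the same major.minor.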