AbstractMethodError when creating a Kafka stream

Date: 2018-03-08 19:22:55

Tags: scala apache-spark apache-kafka spark-streaming

I am trying to open a Kafka stream (tried versions 0.11.0.2 and 1.0.1) using the createDirectStream method, and I get this AbstractMethodError:

Exception in thread "main" java.lang.AbstractMethodError
    at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.initializeLogIfNecessary(KafkaUtils.scala:39)
    at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.log(KafkaUtils.scala:39)
    at org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.logWarning(KafkaUtils.scala:39)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.fixKafkaParams(KafkaUtils.scala:201)
    at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.<init>(DirectKafkaInputDStream.scala:63)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:147)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:124)

This is how I'm calling it:

import org.apache.kafka.common.serialization.{IntegerDeserializer, StringDeserializer}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, LocationStrategies}
import org.apache.spark.streaming.kafka010.KafkaUtils.createDirectStream

val preferredHosts = LocationStrategies.PreferConsistent
val kafkaParams = Map(
  "bootstrap.servers" -> "localhost:9092",
  "key.deserializer" -> classOf[IntegerDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id" -> groupId,
  "auto.offset.reset" -> "earliest"
)

val aCreatedStream = createDirectStream[String, String](ssc, preferredHosts,
  ConsumerStrategies.Subscribe[String, String](topics, kafkaParams))

I have Kafka running on 9092, and I am able to create producers and consumers and pass messages between them, so I'm not sure why it doesn't work from the Scala code. Any ideas appreciated.

3 Answers:

Answer 0 (score: 17)

It turns out I was using Spark 2.3 when I should have been using Spark 2.2. Apparently that method was made abstract in the later version, so I was getting the error.
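In build terms, the underlying fix is to keep every Spark artifact, including the Kafka connector, on the same Spark version, so that traits such as org.apache.spark.internal.Logging have matching bytecode at runtime. A minimal build.sbt sketch, assuming sbt is used and with an illustrative version number:

```scala
// build.sbt -- pin all Spark artifacts to one version; a mismatch between
// spark-streaming-kafka-0-10 and the Spark runtime is what produces
// AbstractMethodError at stream creation time
val sparkVersion = "2.2.1" // must match the Spark version you run against

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"                 % sparkVersion % Provided,
  "org.apache.spark" %% "spark-streaming"            % sparkVersion % Provided,
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion
)
```

The connector is not marked Provided because Spark distributions do not ship it; it has to travel with the application jar or be supplied at submit time.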

Answer 1 (score: 5)

I ran into the same exception. In my case, I had built an application jar that depended on spark-streaming-kafka-0-10_2.11 version 2.1.0, while trying to deploy it to a Spark 2.3.0 cluster.
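If the connector is supplied at submit time rather than bundled into the jar, the same rule applies: the --packages coordinate must carry the cluster's Spark version. An illustrative invocation (the class and jar names are hypothetical):

```
spark-submit \
  --class com.example.MyStreamingApp \
  --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.3.0 \
  my-app.jar
```

Here 2.3.0 in the connector coordinate matches the Spark 2.3.0 cluster; swapping in a 2.1.0 connector would reproduce the AbstractMethodError described above.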

Answer 2 (score: 0)

I got the same error. I set my dependencies to the same version as my Spark interpreter:

%spark2.dep
z.reset()
z.addRepo("MavenCentral").url("https://mvnrepository.com/")

z.load("org.apache.spark:spark-streaming-kafka-0-10_2.11:2.3.0")
z.load("org.apache.kafka:kafka-clients:2.3.0")