Kafka consumer cannot subscribe to a Kafka topic (running via Spark Streaming)

Date: 2017-08-11 03:35:36

Tags: apache-kafka spark-streaming kafka-consumer-api sbt-assembly

The code that sets up the consumer after creating the props object:

val consumer = new KafkaConsumer[String, String](props)
consumer.subscribe(util.Arrays.asList(topic))
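
For context, a minimal sketch of the props object the snippet above assumes (using the Properties and ConsumerConfig imports shown below); the broker address and group id are placeholders, not values from the question:

val props = new Properties()
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092") // placeholder broker address
props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-consumer-group")       // placeholder group id
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
  "org.apache.kafka.common.serialization.StringDeserializer")
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
  "org.apache.kafka.common.serialization.StringDeserializer")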

The imports in the code are as follows:

package main.scala
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.Seconds
import org.apache.spark.streaming.kafka.KafkaUtils
import java.util
import java.util.Properties
import org.apache.kafka.clients.consumer.{ConsumerConfig, KafkaConsumer}
import java.io.IOException

I created an assembly jar via sbt with the following dependencies:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided" 
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.0" % "provided" 
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.6.0"
libraryDependencies += "org.apache.kafka" % "kafka_2.10" % "0.10.0-kafka-2.1.1"

Could you tell me what is missing here?

Error message:

User class threw exception: java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.subscribe(Ljava/util/Collection;)V
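
For what it's worth, KafkaConsumer.subscribe(java.util.Collection) only exists from kafka-clients 0.10 onward (the 0.9 client exposed subscribe(java.util.List)), so a NoSuchMethodError with that signature usually means the jar was built against a 0.10+ client while an older client sits on the runtime classpath. A version-aligned sketch of the dependency block above, pinning kafka-clients explicitly (versions are illustrative, not verified against this cluster):

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.6.0"
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.10.0.1" // illustrative version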

2 answers:

Answer 0: (score: 0)

I ran into the same problem with Spark 2.2.0 and Kafka 0.10.0. The problem is that spark2-submit (and spark2-shell as well) defaults to a different Kafka client version.

I found the solution here.

1. Before spark2-submit you have to export the Kafka version:
$ export SPARK_KAFKA_VERSION=0.10
$ spark2-submit ...
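
On Cloudera's Spark 2 distribution, SPARK_KAFKA_VERSION reportedly selects which bundled Kafka client jars spark2-submit puts on the classpath, which is why exporting it resolves the NoSuchMethodError. To confirm which client actually wins at runtime, a small check can be run inside the job (a debugging sketch, not part of the fix itself):

val kafkaJar = classOf[org.apache.kafka.clients.consumer.KafkaConsumer[_, _]]
  .getProtectionDomain.getCodeSource.getLocation
println(s"KafkaConsumer loaded from: $kafkaJar")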

Answer 1: (score: -2)

subscribe takes an input of type java.util.Collection, not java.util.Arrays.asList

Try:

consumer.subscribe(java.util.Arrays.asList("topic"))

It should work...
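
For completeness, a minimal poll loop built on that call might look like this (a sketch against the 0.10.x consumer API; "topic" is a placeholder name):

import scala.collection.JavaConverters._

consumer.subscribe(java.util.Arrays.asList("topic"))
while (true) {
  val records = consumer.poll(1000L) // wait up to 1s for records (0.10.x poll(long) signature)
  for (record <- records.asScala) {
    println(s"offset=${record.offset} key=${record.key} value=${record.value}")
  }
}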