I'm using the Spark Shell (Scala 2.10, with Spark Streaming via org.apache.spark:spark-streaming-kafka-0-10_2.10:2.0.1) to test a Spark/Kafka consumer:
import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka010._
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.dstream.DStream
val kafkaParams = Map[String, Object](
"bootstrap.servers" -> "mykafka01.example.com:9092",
"key.deserializer" -> classOf[StringDeserializer],
"value.deserializer" -> classOf[StringDeserializer],
"group.id" -> "mykafka",
"auto.offset.reset" -> "latest",
"enable.auto.commit" -> (false: java.lang.Boolean)
)
val topics = Array("mytopic")
def createKafkaStream(ssc: StreamingContext, topics: Array[String], kafkaParams: Map[String,Object]) : DStream[(String, String)] = {
KafkaUtils.createDirectStream[String, String](ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams))
}
def messageConsumer(): StreamingContext = {
  val ssc = new StreamingContext(SparkContext.getOrCreate(), Seconds(10))
  createKafkaStream(ssc, topics, kafkaParams).foreachRDD(rdd => {
    rdd.collect().foreach { msg =>
      try {
        println("Received message: " + msg._2)
      } catch {
        case e @ (_: Exception | _: Error | _: Throwable) => {
          println("Exception: " + e.getMessage)
          e.printStackTrace()
        }
      }
    }
  })
  ssc
}
val ssc = StreamingContext.getActiveOrCreate(messageConsumer)
ssc.start()
ssc.awaitTermination()
When I run this, I get the following error:
<console>:60: error: type mismatch;
found : org.apache.spark.streaming.dstream.InputDStream[org.apache.kafka.clients.consumer.ConsumerRecord[String,String]]
required: org.apache.spark.streaming.dstream.DStream[(String, String)]
KafkaUtils.createDirectStream[String, String](ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams))
^
I've checked the Scala/API docs over and over, and this code looks like it should run correctly. Any idea where I'm going wrong?
Answer (score: 4)
Subscribe takes its topics argument as an Array[String], but with def createKafkaStream(ssc: StreamingContext, topics: String, you are passing a single String. Changing the parameter type to Array[String] (and calling it accordingly) will resolve the problem.
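For reference, a minimal sketch of a signature that lines up with the 0-10 integration, reusing the names from the question (untested against your cluster). Note also that the found/required lines in the error show that KafkaUtils.createDirectStream returns an InputDStream[ConsumerRecord[String, String]] rather than a DStream[(String, String)], so the declared return type has to match (or the stream has to be mapped to tuples), and the consuming side then reads record.value() instead of msg._2:

import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.dstream.InputDStream
import org.apache.spark.streaming.kafka010._
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

// Sketch: topics declared as Array[String], and a return type that matches
// what createDirectStream actually produces in spark-streaming-kafka-0-10.
def createKafkaStream(ssc: StreamingContext, topics: Array[String], kafkaParams: Map[String, Object]): InputDStream[ConsumerRecord[String, String]] = {
  KafkaUtils.createDirectStream[String, String](ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams))
}

// The consumer then reads fields off each ConsumerRecord instead of a tuple, e.g.:
// createKafkaStream(ssc, topics, kafkaParams).foreachRDD { rdd =>
//   rdd.foreach(record => println("Received message: " + record.value()))
// }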