Why does the following line using KafkaUtils.createStream

val receiver = KafkaUtils.createStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topics).map(_._2)

give me "error: overloaded method value createStream with alternatives:"?
error: overloaded method value createStream with alternatives:
(jssc: org.apache.spark.streaming.api.java.JavaStreamingContext,keyTypeClass: Class[String],valueTypeClass: Class[String],keyDecoderClass: Class[kafka.serializer.StringDecoder],valueDecoderClass: Class[kafka.serializer.StringDecoder],kafkaParams: java.util.Map[String,String],topics: java.util.Map[String,Integer],storageLevel: org.apache.spark.storage.StorageLevel)org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream[String,String] <and> (ssc: org.apache.spark.streaming.StreamingContext,kafkaParams: scala.collection.immutable.Map[String,String],topics: scala.collection.immutable.Map[String,Int],storageLevel: org.apache.spark.storage.StorageLevel)(implicit evidence$1: scala.reflect.ClassTag[String], implicit evidence$2: scala.reflect.ClassTag[String], implicit evidence$3: scala.reflect.ClassTag[kafka.serializer.StringDecoder], implicit evidence$4: scala.reflect.ClassTag[kafka.serializer.StringDecoder])org.apache.spark.streaming.dstream.ReceiverInputDStream[(String, String)]
cannot be applied to (org.apache.spark.streaming.StreamingContext, scala.collection.immutable.Map[String,String], scala.collection.immutable.Set[String])
KafkaUtils.createStream(ssc, kafkaParams, topics)
^
Answer (score: 0)
cannot be applied to (org.apache.spark.streaming.StreamingContext, scala.collection.mutable.Map[String,String], scala.collection.immutable.Set[String])

Note the argument types that do not match what KafkaUtils.createStream supports. The overload you come closest to is:
createStream[K, V, U <: Decoder[_], T <: Decoder[_]](
ssc: StreamingContext,
kafkaParams: Map[String, String],
topics: Map[String, Int],
storageLevel: StorageLevel)(/** implicits removed */): ReceiverInputDStream[(K, V)]
and your topics argument has type Set, not the expected Map (!).

See the official KafkaWordCount example for a working reference.
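A minimal sketch of the fix: convert the Set of topic names into the Map[String, Int] that this createStream overload expects, where the value is the number of receiver threads per topic. The topic names and thread count below are placeholders, not values from your setup:

```scala
// createStream wants Map[topic -> number of receiver threads], not Set[topic].
val topicSet = Set("topicA", "topicB")   // hypothetical topic names
val numThreads = 1                       // threads per topic (assumption)
val topicMap: Map[String, Int] = topicSet.map(t => (t, numThreads)).toMap

// With the Map in place, the original call should compile:
// val receiver = KafkaUtils.createStream[String, String, StringDecoder, StringDecoder](
//   ssc, kafkaParams, topicMap).map(_._2)
```

This mirrors what the KafkaWordCount example does before calling createStream.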