Scala error with Spark Streaming Kafka: ambiguous reference to overloaded definition

Date: 2016-06-03 22:03:59

Tags: scala apache-spark spark-streaming

I am trying to create a Kafka direct stream and supply the offsets externally from my Spark Streaming module, but this results in a compilation error.

Here is the code that creates the Kafka direct stream:
import kafka.message.MessageAndMetadata
import kafka.serializer.DefaultDecoder
import org.apache.kafka.common.TopicPartition
import org.apache.spark.streaming.kafka.KafkaUtils

val kafkaParams = Map("metadata.broker.list" -> "kafka.brokers")
// testing only
val fromOffsets: Map[TopicPartition, Long] = Map[TopicPartition, Long]()

val kafkaStream = KafkaUtils.createDirectStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder, Array[Byte]](
    ssc, kafkaParams, fromOffsets, (mmd: MessageAndMetadata[Array[Byte], Array[Byte]]) => mmd.message())

This is the compilation error I am getting. Any ideas/pointers?

    ambiguous reference to overloaded definition,
both method createDirectStream in object KafkaUtils of type (jssc: org.apache.spark.streaming.api.java.JavaStreamingContext, keyClass: Class[Array[Byte]], valueClass: Class[Array[Byte]], keyDecoderClass: Class[kafka.serializer.DefaultDecoder], valueDecoderClass: Class[kafka.serializer.DefaultDecoder], recordClass: Class[Array[Byte]], kafkaParams: java.util.Map[String,String], fromOffsets: java.util.Map[kafka.common.TopicAndPartition,Long], messageHandler: org.apache.spark.api.java.function.Function[kafka.message.MessageAndMetadata[Array[Byte],Array[Byte]],Array[Byte]])org.apache.spark.streaming.api.java.JavaInputDStream[Array[Byte]]
and  method createDirectStream in object KafkaUtils of type (ssc: org.apache.spark.streaming.StreamingContext, kafkaParams: Map[String,String], fromOffsets: Map[kafka.common.TopicAndPartition,Long], messageHandler: kafka.message.MessageAndMetadata[Array[Byte],Array[Byte]] => Array[Byte])(implicit evidence$14: scala.reflect.ClassTag[Array[Byte]], implicit evidence$15: scala.reflect.ClassTag[Array[Byte]], implicit evidence$16: scala.reflect.ClassTag[kafka.serializer.DefaultDecoder], implicit evidence$17: scala.reflect.ClassTag[kafka.serializer.DefaultDecoder], implicit evidence$18: scala.reflect.ClassTag[Array[Byte]])org.apache.spark.streaming.dstream.InputDStream[Array[Byte]]
match expected type ?
[ERROR]     val kafkaStream = KafkaUtils.createDirectStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder, Array[Byte]]

1 Answer:

Answer 0 (score: 1):

Please use kafka.common.TopicAndPartition instead of org.apache.kafka.common.TopicPartition. Both overloads shown in the error expect the offsets map to be keyed by kafka.common.TopicAndPartition, so a Map[TopicPartition, Long] matches neither and the compiler cannot choose between them.
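
A minimal sketch of the corrected call, assuming the Spark 1.x spark-streaming-kafka (Kafka 0.8 direct) API and the same byte-array key/value types as in the question; the topic name, partition, and starting offset are hypothetical placeholders:

import kafka.common.TopicAndPartition
import kafka.message.MessageAndMetadata
import kafka.serializer.DefaultDecoder
import org.apache.spark.streaming.kafka.KafkaUtils

val kafkaParams = Map("metadata.broker.list" -> "kafka.brokers")

// The fromOffsets parameter of this createDirectStream overload is keyed by
// kafka.common.TopicAndPartition, not org.apache.kafka.common.TopicPartition
// from the new consumer API.
val fromOffsets: Map[TopicAndPartition, Long] =
  Map(TopicAndPartition("my-topic", 0) -> 0L) // placeholder topic/partition/offset

val kafkaStream = KafkaUtils.createDirectStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder, Array[Byte]](
    ssc, kafkaParams, fromOffsets, (mmd: MessageAndMetadata[Array[Byte], Array[Byte]]) => mmd.message())

With the key type corrected, the call matches the StreamingContext-based overload and returns an InputDStream[Array[Byte]], as described in the error message's second alternative.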