Error when creating an RDD from KafkaUtils

Posted: 2018-08-01 09:22:52

Tags: apache-spark apache-kafka rdd

Here is what I am trying to do:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.streaming.kafka010.{KafkaUtils, OffsetRange}
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer" -> "org.apache.kafka.common.serialization.StringDeserializer",
      "group.id" -> "mygroup",
      "value.deserializer" -> "org.apache.kafka.common.serialization.StringDeserializer"
    )
    val sparkConf = new SparkConf().setAppName("StructuredStreaming").setMaster("local[2]")
    sparkConf.set("spark.driver.allowMultipleContexts", "true")
    val sc = new SparkContext(sparkConf)
    val offsetRanges = Array(
      // topic, partition, inclusive starting offset, exclusive ending offset
      OffsetRange("test", 0, 0, 100),
      OffsetRange("test", 1, 0, 100)
    )
    val rdd1 = KafkaUtils.createRDD[String, String](sc, kafkaParams, offsetRanges, PreferConsistent)

This is where I get the error:


Cannot resolve symbol createRDD

When I remove the [Key, Value] type parameters from the createRDD call, the arguments inside the call are not recognized either.
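
For context, here is a minimal sketch that compiles against the spark-streaming-kafka-0-10 connector, assuming the "cannot resolve" message comes from overload resolution: this createRDD variant expects a java.util.Map[String, Object] for the Kafka parameters, so the Scala Map is converted with .asJava. The object name and the final print loop are illustrative only.

    import org.apache.kafka.clients.consumer.ConsumerRecord
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.rdd.RDD
    import org.apache.spark.streaming.kafka010.{KafkaUtils, OffsetRange}
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

    import scala.collection.JavaConverters._

    // Hypothetical driver object, for illustration only.
    object KafkaRddSketch {
      def main(args: Array[String]): Unit = {
        val sparkConf = new SparkConf().setAppName("KafkaRddSketch").setMaster("local[2]")
        val sc = new SparkContext(sparkConf)

        // Built as a Scala Map, then converted to the java.util.Map
        // that this createRDD overload takes as its second argument.
        val kafkaParams = Map[String, Object](
          "bootstrap.servers" -> "localhost:9092",
          "key.deserializer" -> "org.apache.kafka.common.serialization.StringDeserializer",
          "value.deserializer" -> "org.apache.kafka.common.serialization.StringDeserializer",
          "group.id" -> "mygroup"
        ).asJava

        val offsetRanges = Array(
          // topic, partition, inclusive starting offset, exclusive ending offset
          OffsetRange("test", 0, 0, 100),
          OffsetRange("test", 1, 0, 100)
        )

        // Explicit [String, String] keeps the ConsumerRecord key/value types.
        val rdd: RDD[ConsumerRecord[String, String]] =
          KafkaUtils.createRDD[String, String](sc, kafkaParams, offsetRanges, PreferConsistent)

        rdd.map(record => (record.key, record.value)).take(5).foreach(println)

        sc.stop()
      }
    }

If the classpath only carries an older spark-streaming-kafka artifact (or one built for a different Scala version), this four-argument overload does not exist, which would also surface as an unresolved symbol.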

0 Answers:

There are no answers yet.