Data key in KafkaStream is null

Time: 2018-10-19 08:10:18

Tags: apache-spark apache-kafka spark-streaming

When consuming an Apache Kafka topic with Spark Streaming via a direct stream, the data key in the KafkaStream is null:

import kafka.serializer.StringDecoder
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.dstream.InputDStream
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.streaming.{Seconds, StreamingContext}

object DataClean { // enclosing object (name arbitrary)
  def main(args: Array[String]): Unit = {
    val conf: SparkConf = new SparkConf().setAppName("sparktest").setMaster("local[2]")
    val sc = new SparkContext(conf)
    sc.setLogLevel("WARN")
    // PropertiesUtil is our own helper for reading settings from the classpath
    val util = new PropertiesUtil("/common.properties")
    val offsetreset = util.getProperty("Dataclean_offsetRest")
    val brokerlist = util.getProperty("brokerlist")
    val zookeeperCon = util.getProperty("zookeeperCon")
    val groupid: String = util.getProperty("Dataclean_groupid")
    val sparkinterval = util.getProperty("Dataclean_sparkinterval").toInt
    val topicStr: String = util.getProperty("Dataclean_topic")
    val ssc = new StreamingContext(sc, Seconds(sparkinterval))
    val topic = topicStr.split(",").toSet
    val kafkaParams = Map(
      "serializer.class" -> "kafka.serializer.StringEncoder",
      "metadata.broker.list" -> brokerlist,
      "zookeeper.connect" -> zookeeperCon,
      "auto.offset.reset" -> "smallest",
      "group.id" -> groupid,
      "zookeeper.session.timeout.ms" -> "40000")
    // Each element of the direct stream is a (key, value) pair
    val kafkaStream: InputDStream[(String, String)] =
      KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topic)
    kafkaStream.transform(rdd => {
      rdd.mapPartitions(records => {
        records.map(json => {
          println("i am here")
          json._1 // the message key -- this prints null
        })
      })
    }).print()
    ssc.start()
    ssc.awaitTermination()
  }
}

(screenshot: console output where every printed key is null)

But when I print json._2, I get the following result:

(screenshot: console output showing the expected message values)
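For context, with the direct stream the key in each (key, value) pair is whatever the producer attached to the message; Kafka does not derive one from the payload, so unkeyed messages arrive with a null key. Below is a minimal producer sketch, using the 0.8-era Scala API matching the StringEncoder setting above, that publishes a keyed message (the broker address, topic name, and object name are placeholders):

import java.util.Properties

import kafka.producer.{KeyedMessage, Producer, ProducerConfig}

object KeyedProducerSketch {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("metadata.broker.list", "localhost:9092") // placeholder broker address
    props.put("serializer.class", "kafka.serializer.StringEncoder")
    props.put("key.serializer.class", "kafka.serializer.StringEncoder")
    val producer = new Producer[String, String](new ProducerConfig(props))
    // KeyedMessage(topic, key, message): the second argument becomes json._1
    // on the consumer side; the two-argument form sends a null key instead.
    producer.send(new KeyedMessage[String, String]("test-topic", "some-key", "{\"field\":\"value\"}"))
    producer.close()
  }
}

If the messages already on the topic were published without a key (the console producer does this by default), json._1 will remain null regardless of how the consumer is configured.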

0 Answers:

There are no answers