Kafka Direct Stream to DataFrame not working with window

Time: 2019-01-16 08:22:48

Tags: scala apache-spark spark-streaming rdd

I have the code below. I am using Zeppelin 0.8.0 with org.apache.spark:spark-streaming-kafka-0-10_2.11:2.3.1, running on Spark 2.3.1.

stream.window(Minutes(5), Seconds(20)).foreachRDD { rdd =>
    val lines = rdd.map(record => record.value())
    val words = lines.flatMap(line => line.split(" "))
    val pairs = words.map(word => (word, 1))
    val wordCounts = pairs.reduceByKey((x: Int, y: Int) => (x + y))
    wordCounts.toDF("word", "count").createOrReplaceTempView("words")
}
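
For context, stream is a Kafka 0-10 direct stream. A minimal sketch of how it might be created (the broker address, group id, and the ssc StreamingContext are assumptions here; the topic test is taken from the error message below):

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.{Minutes, Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

val ssc = new StreamingContext(sc, Seconds(20))   // sc: the SparkContext Zeppelin provides;
                                                  // 20s batches divide the window slide evenly

val kafkaParams = Map[String, Object](
    "bootstrap.servers" -> "localhost:9092",      // assumption: local broker
    "key.deserializer" -> classOf[StringDeserializer],
    "value.deserializer" -> classOf[StringDeserializer],
    "group.id" -> "zeppelin-wordcount",           // assumption: any unused group id
    "auto.offset.reset" -> "latest",
    "enable.auto.commit" -> (false: java.lang.Boolean)
)

val stream = KafkaUtils.createDirectStream[String, String](
    ssc, PreferConsistent, Subscribe[String, String](Array("test"), kafkaParams))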

However, when I try to query the words table while using the sliding window, I get the following error:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0.0 in stage 540.0 (TID 3036) had a not serializable result: org.apache.kafka.clients.consumer.ConsumerRecord
Serialization stack:
    - object not serializable (class: org.apache.kafka.clients.consumer.ConsumerRecord, value: ConsumerRecord(topic = test, partition = 0, offset = 249, CreateTime = 1547626717449, checksum = 3583250337, serialized key size = -1, serialized value size = 4, key = null, value = test))

Any suggestions on how to make this work? It seems like a pretty basic example.

If I run it without the window function, it works fine.

1 Answer:

Answer 0 (score: 0)

Apparently, you need to use transform(...) to convert the stream into one that no longer carries ConsumerRecord objects before applying the window. window has to hold on to the RDDs of earlier batches (persisting, and potentially serializing, their elements), and ConsumerRecord is not serializable, which is also why the un-windowed version works. So extract the values first, call window on the cleaned stream, then transform the windowed stream and build the result table.

// kafkaStream is the Kafka direct stream (called stream in the question).
// Drop the non-serializable ConsumerRecord wrapper first, keeping only the message values.
val cleanedStream = kafkaStream.transform(rdd => rdd.map(record => record.value))

// Windowing is safe now: the stream contains plain, serializable Strings.
val windowedStream = cleanedStream.window(Minutes(5), Seconds(20))

// Word count over the contents of each window.
val transformedStream = windowedStream.transform { rdd =>
    val words = rdd.flatMap(line => line.split(" "))
    val pairs = words.map(word => (word, 1))
    pairs.reduceByKey(_ + _)
}

// Re-register the temp view on every batch so queries always see the latest window.
transformedStream.foreachRDD { rdd =>
    rdd.toDF("word", "count").createOrReplaceTempView("words")
}
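
Once the streaming context is started, the words view is refreshed on every slide and can be queried from another Zeppelin paragraph. A hypothetical example, assuming the ssc from the setup sketch above:

ssc.start()

// After at least one window has completed, query the latest word counts.
spark.sql("SELECT word, `count` FROM words ORDER BY `count` DESC").show()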