Spark Streaming job fails with NullPointerException

Asked: 2019-08-04 15:06:50

Tags: kafka-producer-api spark-streaming-kafka

I am running a Spark job with kafka-clients:2.2.1 and spark-streaming-kafka-0-10_2.11:2.4.3.

The job runs like this:

val scc: StreamingContext = new StreamingContext(spark, Seconds(180))
val kafkaSink: Broadcast[KafkaSink] = ...
val stream = KafkaUtils.createDirectStream(...)
stream // ... some logic
  .foreachRDD { (rdd, time) =>
    rdd.foreach { row =>
      val rowJson = convertRowToJSON(row)
      kafkaSink.value.send(kafkaOptions.topic, rowJson)
    }
  }
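For context, the `KafkaSink` referenced above is not shown in the question. A common pattern for broadcasting a producer to executors looks roughly like the sketch below; this is an assumption about the question's `KafkaSink`, not its actual code. The key idea is that the `KafkaProducer` itself is not serializable, so only a factory function is broadcast and the producer is created lazily on each executor.

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

// Hypothetical sketch of a broadcastable Kafka sink. Only the factory
// closure is serialized; the producer is built lazily per executor JVM.
class KafkaSink(createProducer: () => KafkaProducer[String, String])
    extends Serializable {

  // Created once per executor on first use, never shipped from the driver.
  lazy val producer: KafkaProducer[String, String] = createProducer()

  def send(topic: String, value: String): Unit =
    producer.send(new ProducerRecord[String, String](topic, value))
}

object KafkaSink {
  def apply(config: Properties): KafkaSink =
    new KafkaSink(() => new KafkaProducer[String, String](config))
}
```

If the actual `KafkaSink` differs from this sketch (for example, if the producer can be closed or nulled out while tasks are still sending), that difference may be relevant to the failure below.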

The job runs, but for some mini-batches there are failed tasks with the following exception:

java.lang.NullPointerException
at org.apache.kafka.clients.producer.KafkaProducer.waitOnMetadata(KafkaProducer.java:1010)
at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:916)
at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:897)
at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:774)
at com.outbrain.recs.rtap.kafka.KafkaSink.send(KafkaSink.scala:42)
at 

What could be the cause of these exceptions?

0 Answers