Kafka Consumer with PySpark errors TaskSetManager:70 and ReceiverTracker:91

Time: 2019-06-21 23:13:58

Tags: apache-spark apache-kafka spark-streaming kafka-consumer-api

I push tweets into Kafka with a Kafka producer and then read them back with a Kafka consumer. The code runs, but every run ends with the following error.
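The producer side is not shown in the question; a minimal sketch of writing tweet text into the "tweetstream" topic, assuming the kafka-python library and a broker on localhost:9092 (both assumptions, not stated in the question), could look like this:

from kafka import KafkaProducer  # kafka-python (assumed library)

producer = KafkaProducer(bootstrap_servers="localhost:9092")

# Placeholder loop; in the real pipeline these strings would come from the Twitter API.
for tweet_text in ["example tweet 1", "example tweet 2"]:
    producer.send("tweetstream", value=tweet_text.encode("utf-8"))

producer.flush()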

2019-06-21 19:07:57 ERROR TaskSetManager:70 - Task 0 in stage 450.0 failed 1 times; aborting job
2019-06-21 19:07:57 ERROR ReceiverTracker:91 - Receiver has been stopped. Try to restart it.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 450.0 failed 1 times, most recent failure: Lost task 0.0 in stage 450.0 (TID 384, localhost, executor driver): 

The code I am currently using to consume the tweets into PySpark is:

dataStream = KafkaUtils.createStream(
    ssc=ssc,
    zkQuorum='localhost:2181',
    groupId=0,
    topics={"tweetstream": 1},
    kafkaParams=params,
    valueDecoder=lambda x: x.decode())
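For reference, a minimal, self-contained version of such a receiver-based consumer could look like the sketch below. The app name, the 10-second batch interval, the string group id, and the pprint output are assumptions not taken from the question (which passes groupId=0 and a separate params dict that is not shown); this is only a sketch of the surrounding setup, not a confirmed fix for the error above.

from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

sc = SparkContext(appName="TweetStreamConsumer")   # app name assumed
ssc = StreamingContext(sc, 10)                     # 10-second batches assumed

# Receiver-based Kafka stream via ZooKeeper; group id passed as a string here.
dataStream = KafkaUtils.createStream(
    ssc,
    zkQuorum="localhost:2181",
    groupId="tweet-consumer-group",                # assumed; the question uses 0
    topics={"tweetstream": 1},
    valueDecoder=lambda x: x.decode("utf-8"))

# Print the tweet text (the message value) of each batch.
dataStream.map(lambda kv: kv[1]).pprint()

ssc.start()
ssc.awaitTermination()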

0 Answers:

No answers yet.