Spark Streaming StreamingContext.start() - Error starting receiver 0

Asked: 2015-01-14 11:23:38

Tags: scala apache-spark apache-kafka spark-streaming

I have a project that uses Spark Streaming, and I'm running it with 'spark-submit', but I'm hitting this error:

15/01/14 10:34:18 ERROR ReceiverTracker: Deregistered receiver for stream 0: Error starting receiver 0 - java.lang.AbstractMethodError
    at org.apache.spark.Logging$class.log(Logging.scala:52)
    at org.apache.spark.streaming.kafka.KafkaReceiver.log(KafkaInputDStream.scala:66)
    at org.apache.spark.Logging$class.logInfo(Logging.scala:59)
    at org.apache.spark.streaming.kafka.KafkaReceiver.logInfo(KafkaInputDStream.scala:66)
    at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:86)
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121)
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
    at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264)
    at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
    at org.apache.spark.scheduler.Task.run(Task.scala:54)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Here is the code the error comes from; everything runs fine until ssc.start():

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // Command-line arguments: ZooKeeper quorum, consumer group, topic list, thread count
    val Array(zkQuorum, group, topics, numThreads) = args
    val sparkConf = new SparkConf().setAppName("Jumbly_StreamingConsumer")
    val ssc = new StreamingContext(sparkConf, Seconds(2)) // 2-second batch interval
    ssc.checkpoint("checkpoint")
    // ...
    ssc.start()
    ssc.awaitTermination()
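
The elided part sets up the Kafka input stream; judging by the KafkaReceiver in the stack trace, it uses the receiver-based API, roughly along the lines of Spark's KafkaWordCount example (a sketch, not the actual code):

    import org.apache.spark.streaming.kafka.KafkaUtils

    // Sketch only: a receiver-based Kafka stream, matching the KafkaReceiver
    // that appears in the stack trace above.
    val topicMap = topics.split(",").map((_, numThreads.toInt)).toMap
    val lines = KafkaUtils.createStream(ssc, zkQuorum, group, topicMap).map(_._2)
    lines.print()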

I ran the SparkPi example with 'spark-submit' and it works fine, so I can't figure out what is causing the problem in my application. Any help would be greatly appreciated.

1 Answer:

Answer 0 (score: 3):

From the documentation of java.lang.AbstractMethodError:

    Normally, this error is caught by the compiler; this error can only occur
    at run time if the definition of some class has incompatibly changed since
    the currently executing method was last compiled.

This means there is a version incompatibility between the dependencies your application was compiled against and the ones present at runtime. In the stack trace above, KafkaReceiver fails while calling into org.apache.spark.Logging, which suggests the spark-streaming-kafka artifact was built against a different Spark version than the one spark-submit is running. Make sure the versions of spark-core, spark-streaming, and spark-streaming-kafka (and the Scala version they are built for) all match to resolve this.
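
For example, if the project is built with sbt, the versions can be pinned in one place. A minimal sketch, assuming an sbt build and Spark 1.2.0 on the cluster (set sparkVersion to whatever `spark-submit --version` reports):

    // build.sbt -- a hypothetical sketch, not the asker's actual build file.
    scalaVersion := "2.10.4" // assumption: must match the Scala version Spark was built with

    val sparkVersion = "1.2.0" // assumption: set this to the cluster's Spark version

    libraryDependencies ++= Seq(
      // "provided": spark-submit supplies these jars at runtime, so the
      // compile-time and runtime versions cannot drift apart.
      "org.apache.spark" %% "spark-core"            % sparkVersion % "provided",
      "org.apache.spark" %% "spark-streaming"       % sparkVersion % "provided",
      // The Kafka connector is not on the cluster classpath, so bundle it
      // (e.g. with sbt-assembly) at the same version.
      "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion
    )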