Spark Kafka - problem when running from the Eclipse IDE

Date: 2015-08-25 12:28:36

Tags: eclipse scala apache-spark apache-kafka

I am trying to use the Spark Kafka integration and want to test the code from my Eclipse IDE. However, I get the following error:

java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
    at kafka.utils.Pool.<init>(Pool.scala:28)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<init>(FetchRequestAndResponseStats.scala:60)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<clinit>(FetchRequestAndResponseStats.scala)
    at kafka.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:39)
    at org.apache.spark.streaming.kafka.KafkaCluster.connect(KafkaCluster.scala:52)
    at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers$1.apply(KafkaCluster.scala:345)
    at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers$1.apply(KafkaCluster.scala:342)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:35)
    at org.apache.spark.streaming.kafka.KafkaCluster.org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers(KafkaCluster.scala:342)
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitionMetadata(KafkaCluster.scala:125)
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitions(KafkaCluster.scala:112)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:403)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:532)
    at org.apache.spark.streaming.kafka.KafkaUtils.createDirectStream(KafkaUtils.scala)
    at com.capiot.platform.spark.SparkTelemetryReceiverFromKafkaStream.executeStreamingCalculations(SparkTelemetryReceiverFromKafkaStream.java:248)
    at com.capiot.platform.spark.SparkTelemetryReceiverFromKafkaStream.main(SparkTelemetryReceiverFromKafkaStream.java:84)

Update: The versions I am using are:

  • scala - 2.11
  • spark-streaming-kafka - 1.4.1
  • spark - 1.4.1

Can anyone help resolve this? Thanks in advance.
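
For reference, the KafkaUtils.createDirectStream frames in the trace correspond to a direct-stream setup roughly like the following minimal sketch (the class name, broker address, and topic are illustrative, not the OP's actual code):

    import java.util.Collections;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.Set;

    import kafka.serializer.StringDecoder;

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaPairInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;
    import org.apache.spark.streaming.kafka.KafkaUtils;

    public class KafkaDirectStreamSketch {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("kafka-direct-test");
            JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(5));

            // Direct (receiver-less) Kafka stream, matching the
            // createDirectStream frames in the stack trace above.
            Map<String, String> kafkaParams = new HashMap<String, String>();
            kafkaParams.put("metadata.broker.list", "localhost:9092");
            Set<String> topics = Collections.singleton("test-topic");

            JavaPairInputDStream<String, String> stream = KafkaUtils.createDirectStream(
                    jssc,
                    String.class, String.class,
                    StringDecoder.class, StringDecoder.class,
                    kafkaParams, topics);

            stream.print();
            jssc.start();
            jssc.awaitTermination();
        }
    }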

2 Answers:

Answer 0 (score: 0)

Your Scala version is wrong. You need 2.10.x, per

https://spark.apache.org/docs/1.4.1/

"For the Scala API, Spark 1.4.1 uses Scala 2.10."
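
In practice this means every Spark artifact on the classpath must carry the _2.10 suffix. A sketch of matching Maven dependencies for Spark 1.4.1 (assuming a Maven build; adjust the artifact list to your project):

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>1.4.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka_2.10</artifactId>
        <version>1.4.1</version>
    </dependency>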

Answer 1 (score: 0)

Probably too late to help the OP, but when using Kafka streaming with Spark, you need to make sure you are using the correct jar files.

For example, in my case I have Scala 2.11 (the minimum required by Spark 2.0, which I am using), and since the Kafka integration has to match Spark version 2.0.0, I had to use the artifact spark-streaming-kafka-0-8-assembly_2.11-2.0.0-preview.jar.

Note that both my Scala version and the Spark version can be read off the 2.11-2.0.0 part of the artifact name.
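
Expressed as build coordinates rather than a raw jar, the rule is that the Scala suffix on the artifactId and the artifact version must both match your project (a sketch, assuming a Maven build):

    <!-- _2.11 matches the project's Scala version; 2.0.0 matches Spark -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
        <version>2.0.0</version>
    </dependency>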

Hope this helps (someone).
