java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce

Date: 2017-09-28 21:44:45

Tags: scala maven apache-spark dependencies apache-kafka

I am trying to run Spark Streaming with Kafka. I am building with Scala 2.11.8 and Spark 2.1.0 (built against Scala 2.11). I know this error usually means a Scala version mismatch, but all dependencies are added with matching versions (listed below), and I still get this error.

Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
    at kafka.utils.Pool.<init>(Unknown Source)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<init>(Unknown Source)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<clinit>(Unknown Source)
    at kafka.consumer.SimpleConsumer.<init>(Unknown Source)
    at org.apache.spark.streaming.kafka.KafkaCluster.connect(KafkaCluster.scala:59)
    at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers$1.apply(KafkaCluster.scala:364)
    at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers$1.apply(KafkaCluster.scala:361)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:35)
    at org.apache.spark.streaming.kafka.KafkaCluster.org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers(KafkaCluster.scala:361)
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitionMetadata(KafkaCluster.scala:132)
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitions(KafkaCluster.scala:119)
    at org.apache.spark.streaming.kafka.KafkaUtils$.getFromOffsets(KafkaUtils.scala:211)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:484)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:607)
    at com.forrester.streaming.kafka.App$.main(App.scala:19)
    at com.forrester.streaming.kafka.App.main(App.scala)
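
The actual App.scala is not included here; below is a minimal sketch of the kind of call that would sit at App.scala:19, assuming the spark-streaming-kafka-0-8 direct-stream API (the broker address and topic name are placeholders):

    package com.forrester.streaming.kafka

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object App {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("kafka-direct-stream")
        val ssc  = new StreamingContext(conf, Seconds(5))

        // Placeholder broker address and topic name.
        val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
        val topics      = Set("test-topic")

        // This is the call that reaches KafkaCluster.connect / SimpleConsumer
        // in the stack trace above.
        val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
          ssc, kafkaParams, topics)

        stream.map(_._2).print()
        ssc.start()
        ssc.awaitTermination()
      }
    }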

Dependencies

<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.11.8</version>
        <scope>provided</scope>
    </dependency>

    <dependency>
        <groupId>com.koverse</groupId>
        <artifactId>koverse-shaded-deps</artifactId>
        <version>${koverse.version}</version>
        <scope>provided</scope>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>2.1.0</version>
        <exclusions>
            <exclusion>
                <groupId>*</groupId>
                <artifactId>*</artifactId>
            </exclusion>
        </exclusions>
    </dependency>

    <dependency>
        <groupId>org.scalanlp</groupId>
        <artifactId>breeze_2.11</artifactId>
        <version>0.11.2</version>
    </dependency>

    <dependency>
        <groupId>org.xerial.snappy</groupId>
        <artifactId>snappy-java</artifactId>
        <version>1.0.5</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.11</artifactId>
        <version>2.1.0</version>
        <scope>test</scope>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-8-assembly_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
</dependencies>
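
A minimal diagnostic sketch (the object name is arbitrary, not from the project) to confirm which scala-library one of these dependencies actually puts on the runtime classpath:

    object ScalaVersionCheck {
      def main(args: Array[String]): Unit = {
        // e.g. "2.11.8" — the version of the scala-library that was loaded.
        println("scala-library version: " + scala.util.Properties.versionNumberString)

        // The jar that the core collection classes are served from.
        val src = classOf[scala.collection.GenTraversableOnce[_]].getProtectionDomain.getCodeSource
        println("GenTraversableOnce loaded from: " + Option(src).map(_.getLocation).orNull)
      }
    }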

I did some further analysis with different version combinations:

Case | Spark built on Scala | Kafka jar | Result

1 | Spark 2.1.1 on Scala 2.11.8 | spark-streaming-kafka-0-8-assembly_2.11-2.1.1.jar | works

2 | Spark 2.1.1 on Scala 2.11.8 | spark-streaming-kafka-0-8-assembly_2.10-2.1.1.jar | error, expected

3 | Spark 2.1.1 on Scala 2.11.8 | spark-streaming-kafka-0-8-assembly_2.10-2.1.0.jar | error, expected

4 | Spark 2.1.0 on Scala 2.11.8 | spark-streaming-kafka-0-8-assembly_2.10-2.1.0.jar | error, expected

5 | Spark 2.1.0 on Scala 2.11.8 | spark-streaming-kafka-0-8-assembly_2.11-2.1.0.jar | error: ideally this should pass

6 | Spark 2.1.0 on Scala 2.11.8 | spark-streaming-kafka-0-8-assembly_2.11-2.1.1.jar | error, expected

7 | Spark 2.1.0 on Scala 2.11.8 | spark-streaming-kafka-0-8-assembly_2.10-2.1.0.jar | error, expected

The error message in the failing cases is ClassNotFoundException: scala.collection.GenTraversableOnce$class

Case 1 works, but case 5 fails even though it should not throw any error.
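
As a further check for case 5, here is a hypothetical diagnostic (the object and method names are mine, not from the project) that prints the jar each suspect class is loaded from at runtime, to verify that the _2.11 assembly from the table above is really the one being picked up:

    object ClasspathOrigin {
      // Returns the location of the jar that defines the given class, if it can be loaded.
      private def originOf(className: String): String =
        try {
          val cls = Class.forName(className)
          Option(cls.getProtectionDomain.getCodeSource)
            .map(_.getLocation.toString)
            .getOrElse("<bootstrap / unknown>")
        } catch {
          case t: Throwable => s"failed to load: $t"
        }

      def main(args: Array[String]): Unit = {
        Seq(
          "kafka.consumer.SimpleConsumer",               // bundled in the Kafka assembly jar
          "org.apache.spark.streaming.kafka.KafkaUtils", // spark-streaming-kafka-0-8
          "scala.collection.GenTraversableOnce"          // scala-library
        ).foreach(c => println(s"$c -> ${originOf(c)}"))
      }
    }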

0 Answers