I'm getting the following exception when calling KafkaUtils.createStream(); below are my Spark dependencies. The same code works with the older Spark Streaming version 1.5.2.
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka_2.11</artifactId>
<version>1.6.2</version>
</dependency>
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:91)
at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:66)
at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:110)
at org.apache.spark.streaming.kafka.KafkaUtils.createStream(KafkaUtils.scala)
at com.tcs.iux.core.config.RealtimeProcessing.startSpark(RealtimeProcessing.java:78)
at com.tcs.iux.core.processor.StartRealTimeProcessing.main(StartRealTimeProcessing.java:32)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 18 more
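The root cause is that `org.apache.spark.Logging` existed in the Spark 1.x jars but is absent from the 2.0.0 jars, so any class compiled against it (such as the 1.6.2 connector's `KafkaUtils`) fails at class-load time. A minimal, self-contained sketch of this failure mode (the class name `MissingClassDemo` is hypothetical, not part of the poster's project):

```java
// Reproduces, in miniature, why a jar compiled against a class that has since
// been removed from the classpath breaks at load time.
public class MissingClassDemo {
    // Tries to load a class by name, returning null if it is not on the classpath.
    static Class<?> tryLoad(String name) {
        try {
            return Class.forName(name);
        } catch (ClassNotFoundException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        // With only Spark 2.0.0 jars present, this lookup fails, which is the
        // same condition that surfaces as NoClassDefFoundError when the JVM
        // links the 1.6.2 connector's classes.
        if (tryLoad("org.apache.spark.Logging") == null) {
            System.out.println("org.apache.spark.Logging is not on the classpath");
        }
    }
}
```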
Answer 0 (score: 0)
Logging was made private in Spark 2.0
Use version 2.0.0 of the Kafka connector, matching your Spark version. Note that the artifact was renamed in Spark 2.0 (the old spark-streaming-kafka_2.11 artifact was never published at 2.0.0):
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
<version>2.0.0</version>
</dependency>
Answer 1 (score: 0)
You're mixing Spark versions, which is never a good idea. You need to align Spark Streaming and its external Kafka connector on the same version.
The dependency artifacts changed in Spark 2.0, which also added experimental support for Kafka 0.10.0.x. If you're using Kafka 0.8.2.x, you'll need:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
<version>2.0.0</version>
</dependency>
If you want to try out Kafka 0.10.0.x, you'll need:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
<version>2.0.0</version>
</dependency>
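To keep the two Spark artifacts from drifting apart again, one common Maven pattern is to factor the version into a shared property. This is a sketch (the property name `spark.version` is a convention, not required):

```xml
<properties>
  <spark.version>2.0.0</spark.version>
</properties>
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>
```

Upgrading Spark then means changing a single line, so the streaming core and the Kafka connector can never disagree.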