I am writing a simple Kafka and Spark Streaming program in Eclipse, using Spark Streaming to consume messages from a Kafka broker. The code is below; when I try to run it from Eclipse I get an error.
I have also made sure the dependency jars are in place. Please help me get rid of this error.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object spark_kafka_streaming {
  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setAppName("The swankiest Spark app ever")
      .setMaster("local[*]")
    val ssc = new StreamingContext(conf, Seconds(60))
    ssc.checkpoint("C:\\keerthi\\software\\eclipse-jee-mars-2-win32-x86_64\\eclipse")

    println("Parameters:" + "zkorum:" + "group:" + "topicMap:" + "number of threads:")

    val zk = "xxxxxxxx:2181"
    val group = "test-consumer-group"
    val topics = "my-replicated-topic"
    val numThreads = 2

    // number of consumer threads per topic
    val topicMap = topics.split(",").map((_, numThreads.toInt)).toMap
    // createStream returns (key, message) pairs; keep only the message payload
    val lines = KafkaUtils.createStream(ssc, zk, group, topicMap).map(_._2)
    val words = lines.flatMap(_.split(" "))
    val wordCounts = words.map(x => (x, 1L)).count()
    println("wordCounts:" + wordCounts)
    //wordCounts.print
  }
}
Exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$
    at org.firststream.spark_kakfa.spark_kafka_streaming$.main(spark_kafka_streaming.scala:30)
    at org.firststream.spark_kakfa.spark_kafka_streaming.main(spark_kafka_streaming.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtils$
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    ... 2 more
Dependencies:
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.8.1.1</version>
    <scope>compile</scope>
    <exclusions>
        <exclusion>
            <artifactId>jmxri</artifactId>
            <groupId>com.sun.jmx</groupId>
        </exclusion>
        <exclusion>
            <artifactId>jms</artifactId>
            <groupId>javax.jms</groupId>
        </exclusion>
        <exclusion>
            <artifactId>jmxtools</artifactId>
            <groupId>com.sun.jdmk</groupId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.8.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.2.0</version>
</dependency>
Answer 0 (score: 1)
I commented out the dependencies below. I added spark-streaming-kafka_2.10, and added the kafka_2.10-0.8.1.1 jar directly to the Referenced Libraries in Eclipse via Build Path -> Configure Build Path -> External JARs. This resolved the issue. (See the sketch after the commented-out blocks below for what the remaining dependencies might look like.)
<!-- <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.8.1.1</version>
    <scope>compile</scope>
    <exclusions>
        <exclusion>
            <artifactId>jmxri</artifactId>
            <groupId>com.sun.jmx</groupId>
        </exclusion>
        <exclusion>
            <artifactId>jms</artifactId>
            <groupId>javax.jms</groupId>
        </exclusion>
        <exclusion>
            <artifactId>jmxtools</artifactId>
            <groupId>com.sun.jdmk</groupId>
        </exclusion>
    </exclusions>
</dependency> -->
<!-- <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.8.2.0</version>
</dependency> -->
<!-- <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.2.0</version>
</dependency> -->
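For reference, here is a minimal sketch of what the POM could look like after that change, assuming the Spark 1.2.0 / Scala 2.10 versions from the question. Whether spark-streaming-kafka_2.10 stays in the POM or is added to the build path as an external jar is a matter of taste; when kept in the POM it declares kafka_2.10 as a transitive dependency, so the manually added Kafka jar is not strictly needed.

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.2.0</version>
</dependency>
<!-- brings org.apache.spark.streaming.kafka.KafkaUtils onto the classpath -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.2.0</version>
</dependency>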