Spark 2.0.2 ClassNotFoundException: org.apache.kafka.clients.consumer.Consumer

Asked: 2016-12-20 10:26:45

Tags: maven apache-spark apache-kafka classnotfound

Below is my pom.xml. I build the jar with the Maven Shade plugin, and I am quite sure org.apache.kafka.clients.consumer.Consumer is included in my uber jar. In addition, I have put kafka-clients-0.10.1.0.jar on spark.driver.extraLibraryPath, and I have also tried adding the --jars option to the spark-submit command. But I still get the ClassNotFoundException.
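One way to narrow this down is to check what the driver's classloader can actually see at startup, before any Spark code runs. The sketch below is a standalone check, not part of the original project; the class name `ClasspathCheck` is hypothetical:

```java
// Classpath sanity check: call this at the top of the driver's main()
// to see whether the Kafka consumer class is visible to the classloader.
public class ClasspathCheck {

    // Prints FOUND or MISSING for the given fully qualified class name.
    static void check(String name) {
        try {
            Class.forName(name);
            System.out.println("FOUND " + name);
        } catch (ClassNotFoundException e) {
            System.out.println("MISSING " + name);
        }
    }

    public static void main(String[] args) {
        // The class the asker's job fails to load:
        check("org.apache.kafka.clients.consumer.Consumer");
        // A control that is always present in any JVM:
        check("java.lang.String");
    }
}
```

If this prints MISSING inside the uber jar you submitted, the jar does not actually contain the class (or the Shade plugin mangled it); if it prints FOUND but Spark still throws, the failure is happening in a different classloader (e.g. on the executors).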

   <dependencies>
            <dependency>
                <groupId>org.scala-lang</groupId>
                <artifactId>scala-reflect</artifactId>
                <version>2.11.8</version>
            </dependency>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.11</artifactId>
                <version>2.0.2</version>
            </dependency>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-streaming_2.11</artifactId>
                <version>2.0.2</version>
            </dependency>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
                <version>2.0.2</version>
            </dependency>
            <dependency>
                <groupId>org.apache.kafka</groupId>
                <artifactId>kafka_2.11</artifactId>
                <version>0.10.1.0</version>
            </dependency>
            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>3.8.1</version>
                <scope>test</scope>
            </dependency>
        </dependencies>
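For reference, a minimal Shade plugin configuration consistent with the dependencies above might look like the sketch below. The plugin version and filters are illustrative, not taken from the original pom; the signature-file exclusions are a common fix when a shaded jar fails to load classes from signed dependencies:

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.4.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
          <configuration>
            <filters>
              <!-- Strip signature files; stale signatures can make
                   classes in the uber jar fail to load -->
              <filter>
                <artifact>*:*</artifact>
                <excludes>
                  <exclude>META-INF/*.SF</exclude>
                  <exclude>META-INF/*.DSA</exclude>
                  <exclude>META-INF/*.RSA</exclude>
                </excludes>
              </filter>
            </filters>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```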


2 Answers:

Answer 0 (score: 0)

I found a workaround: add the jar to SPARK_HOME/jars. I submit with spark-submit, and I tried adding --jars and --driver-library-path. I am sure those options took effect, but it was still ClassNotFoundException. I found the workaround based on the driver log listed below.
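The workaround described above can be sketched as the following commands (the application class and jar names are hypothetical, not from the original post):

```shell
# Copy the Kafka client jar straight into Spark's jars directory,
# which is on the classpath of every Spark JVM (driver and executors).
cp kafka-clients-0.10.1.0.jar "$SPARK_HOME/jars/"

# Then submit as usual; --jars / --driver-library-path are no longer
# needed for this particular class.
"$SPARK_HOME/bin/spark-submit" \
  --class com.example.MyStreamingApp \
  my-uber-jar.jar
```

This works because everything in `$SPARK_HOME/jars` is loaded by Spark's own classloader, sidestepping whatever kept the uber jar's copy of the class from being found.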

[screenshot of the driver log]

Answer 1 (score: 0)

Basically, you need:

kubectl create -f https://raw.githubusercontent.com/../d09.yaml