How to fix the "java.lang.ClassNotFoundException: Failed to find data source: kafka" error in Spark (Java)

Asked: 2019-01-28 16:56:10

Tags: java apache-kafka apache-spark-sql spark-structured-streaming

I am writing a Spark batch job application that reads data from a Kafka topic and displays its contents. Here are the details:

The POM entries for the relevant packages, followed by the code:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql-kafka-0-10_2.11</artifactId>
    <version>2.3.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
    <!--<scope>provided</scope>-->
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.1.0</version>
    <!--<scope>provided</scope>-->
</dependency>
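
Note that in the POM above the spark-sql-kafka-0-10 connector is declared at 2.3.1 while spark-core and spark-sql are at 2.1.0. For comparison, a minimal sketch with all Spark artifacts pinned to the same release (2.3.1 is assumed here, not confirmed by the original post) would look like:

<!-- Sketch only: every Spark artifact on the same assumed version (2.3.1) -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.3.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.3.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql-kafka-0-10_2.11</artifactId>
    <version>2.3.1</version>
</dependency>

When a job is launched with spark-submit rather than from the IDE, the connector can alternatively be supplied with --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.1 (version assumed to match the Spark build in use).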

Dataset<Row> df = this.sparkSession
        .read()
        .format("kafka")
        .option("kafka.bootstrap.servers", "{SERVER_IP:PORT}")
        .option("key.deserializer", "KafkaAvroDeserializer")
        .option("value.deserializer", "KafkaAvroDeserializer")
        .option("group.id", "test")
        .option("auto.offset.reset", "earliest")
        .option("schema.registry.url", "http://{SERVER_IP:PORT}")
        .option("subscribe", "testTopic")
        .load();
System.out.println("Data from kafka");
df.show(10);
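
For context, once the kafka source resolves, the Dataset it returns exposes key and value as binary columns (alongside topic, partition, offset, and timestamp), so displaying readable content usually involves a cast. A minimal sketch, assuming plain string payloads rather than Avro:

// Sketch: cast the binary key/value columns to strings before displaying them
Dataset<Row> decoded = df.selectExpr(
        "CAST(key AS STRING)",    // record key as text
        "CAST(value AS STRING)",  // record value as text
        "topic", "partition", "offset");
decoded.show(10, false);

With Avro payloads and a schema registry, an additional deserialization step would be needed; that is outside the scope of this snippet.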

I am running this from IntelliJ IDEA (not via spark-submit). The job fails with the following exception:

java.lang.ClassNotFoundException: Failed to find data source: kafka. Please find packages at http://spark.apache.org/third-party-projects.html
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:569)
at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:86)
at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:86)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:325)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
at jobs.S3UploadPOC.internalProcess(S3UploadPOC.java:49)
at jobs.AbstractCronJob.process(AbstractCronJob.java:58)
at jobs.S3UploadPOC.main(S3UploadPOC.java:74)
Caused by: java.lang.ClassNotFoundException: kafka.DefaultSource
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:554)
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:554)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:554)
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:554)
at scala.util.Try.orElse(Try.scala:84)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:554)

Am I missing any package in the configuration?

0 Answers:

There are no answers yet.