spark-submit error caused by: java.lang.ClassNotFoundException: kafka.DefaultSource

Date: 2019-06-08 18:54:34

Tags: scala apache-spark spark-streaming spark-structured-streaming spark-streaming-kafka

In my Spark program, I have the following code:

val df = spark.readStream
  .format("kafka")
  .option("subscribe", "raw_weather")
  .option("kafka.bootstrap.servers", "<url:port>s of my brokers")
  .option("kafka.security.protocol", "SASL_SSL")
  .option("kafka.sasl.mechanism", "PLAIN")
  .option("kafka.sasl.jaas.config", "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"username\" password=\"" + "password" + "\";")
  .option("kafka.ssl.protocol", "TLSv1.2")
  .option("kafka.ssl.enabled.protocols", "TLSv1.2")
  .option("kafka.ssl.endpoint.identification.algorithm", "HTTPS")
  .load()

Here is my build.sbt file:

name := "kafka-streaming"
version := "1.0"
scalaVersion := "2.11.12"

// still want to be able to run in sbt
// https://github.com/sbt/sbt-assembly#-provided-configuration
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))

fork in run := true
javaOptions in run ++= Seq(
  "-Dlog4j.debug=true",
  "-Dlog4j.configuration=log4j.properties")

assemblyMergeStrategy in assembly := {
  case "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister" => MergeStrategy.concat
  case PathList("META-INF", _*) => MergeStrategy.discard
  case _ => MergeStrategy.first
}

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.4.0" % "provided", // If not, then this: Exception in thread "streaming-job-executor-0" java.lang.NoClassDefFoundError: org/apache/spark/sql/Dataset
  "org.apache.spark" %% "spark-streaming" % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.4.0" % "provided"
)

If I remove "provided" from build.sbt, I can successfully run the Scala code in IntelliJ IDEA. Now, after running sbt assembly, when I try to run the same program with spark-submit using the following command:

spark-submit --class com.ibm.kafkasparkintegration.executables.WeatherDataStream hdfs://<some address>:8020/user/clsadmin/consumer-example.jar \
             --packages org.apache.spark:spark-core:2.4.0,org.apache.spark:spark-sql:2.4.0,org.apache.spark:spark-sql-kafka-0-10:2.4.0,org.apache.spark:spark-streaming:2.4.0,org.apache.spark:spark-streaming-kafka-0-10:2.4.0

(PS: consumer-example.jar is the JAR I get after running sbt assembly)

I get this error:

Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: kafka. Please find packages at http://spark.apache.org/third-party-projects.html
        at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:635)
        at org.apache.spark.sql.streaming.DataStreamReader.load(DataStreamReader.scala:159)
>>      at com.ibm.kafkasparkintegration.executables.WeatherDataStream$.getRawDataFrame(WeatherDataStream.scala:73)
        at com.ibm.kafkasparkintegration.executables.WeatherDataStream$.main(WeatherDataStream.scala:23)
        at com.ibm.kafkasparkintegration.executables.WeatherDataStream.main(WeatherDataStream.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:906)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: kafka.DefaultSource
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
        at scala.util.Try$.apply(Try.scala:192)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
        at scala.util.Try.orElse(Try.scala:84)
        at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:618)
        ... 14 more

In the error log, the line marked with >> points to the readStream code written above. Since this code works in IntelliJ, I don't understand why it doesn't work with spark-submit?

1 Answer:

Answer 0 (score: 2)

This is caused by this line:

"org.apache.spark" %% "spark-sql-kafka-0-10" % "2.4.0" % "provided"

"provided" means that this dependency is "provided" by whatever machine you use to run the compiled JAR. From the error, it looks like your machine does not provide the Kafka data source, so try removing "provided" from that line and reassembling.
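As a minimal sketch of that fix, reusing the dependency list and merge strategy from the question's build.sbt (versions and module names are taken from the question, not verified against any particular cluster): the Spark core modules stay "provided" because the cluster ships them, but spark-sql-kafka-0-10 is not part of a standard Spark distribution, so it must be bundled into the assembly JAR.

```scala
// build.sbt (sketch) -- keep cluster-supplied Spark modules "provided",
// but bundle the Kafka data source into the fat JAR.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"                 % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-sql"                  % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-streaming"            % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.4.0",
  "org.apache.spark" %% "spark-sql-kafka-0-10"       % "2.4.0"  // no "provided"
)

// Keep the concat rule from the question: it preserves the
// META-INF/services file that maps the short name "kafka" to its
// DataSourceRegister implementation. If this file is discarded during
// assembly, lookupDataSource fails in the same way even when the JAR
// contains the Kafka classes.
assemblyMergeStrategy in assembly := {
  case "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister" => MergeStrategy.concat
  case PathList("META-INF", _*) => MergeStrategy.discard
  case _ => MergeStrategy.first
}
```

An alternative is to keep the dependency "provided" and have spark-submit fetch it at launch with `--packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.0`. Note two details about the command in the question: Maven coordinates need the explicit Scala suffix (`_2.11`), which sbt's `%%` adds automatically, and spark-submit options such as `--packages` must appear before the application JAR; anything after the JAR is passed to the application as arguments, not to spark-submit.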