I am new to Spark and Kafka and just started learning them. I am trying to integrate Kafka with Spark, and the program runs successfully from Eclipse. When I try to run it with spark-submit, I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/kafka/clients/consumer/Consumer
My build.sbt:
name := "spark_streaming"
version := "0.0.1"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.1",
  "org.apache.spark" %% "spark-sql" % "2.1.1",
  "org.apache.spark" %% "spark-mllib" % "2.1.1",
  "org.apache.spark" %% "spark-hive" % "2.1.1",
  "org.apache.spark" %% "spark-streaming" % "2.1.1" % "provided",
  "org.apache.kafka" %% "kafka" % "0.11.0.0",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.1.1",
  "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.1.1",
  "org.apache.kafka" % "kafka-clients" % "0.11.0.0",
  "org.apache.spark" %% "spark-streaming-kafka-assembly" % "1.5.2"
)
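For context, the NoClassDefFoundError indicates that the Kafka client classes, which Eclipse puts on the classpath automatically, are not available to spark-submit at runtime. One common way to address this is to bundle the non-provided dependencies into a single fat jar with sbt-assembly. Below is a minimal sketch, not taken from my actual project; the plugin version and merge strategy are assumptions:

// project/plugins.sbt (assumed plugin version)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")

// build.sbt additions (assumed): discard duplicate META-INF entries so the fat jar builds cleanly
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x                             => MergeStrategy.first
}

The jar produced by running sbt assembly would then be the one passed to spark-submit, so the Kafka integration classes travel with the application instead of being expected on the cluster classpath.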