Exception in thread "main" java.lang.ClassCastException: kafka.cluster.BrokerEndPoint cannot be cast to kafka.cluster.Broker

Date: 2018-07-03 14:34:22

Tags: scala apache-kafka sandbox

The following versions fixed my issue:

    name := "simple-spark-scala"
    version := "1.0"
    scalaVersion := "2.11.8"

    libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.1"
    libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.1.1"
    libraryDependencies += "com.typesafe" % "config" % "1.3.2"
    libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.1.1"
    libraryDependencies += "org.apache.commons" % "commons-lang3" % "3.5"
    libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-8_2.11" % "2.1.1"

This is the command used to run it:

    spark-submit --class streaming.KafkaStreamingDeptCount --master yarn \
      --jars "/usr/hdp/current/kafka-broker/libs/spark-streaming-kafka-0-8_2.11-2.1.0.jar,/usr/hdp/current/kafka-broker/libs/kafka-clients-0.10.2.1.jar,/usr/hdp/current/kafka-broker/libs/kafka_2.11-0.8.2.1.jar,/usr/hdp/current/kafka-broker/libs/metrics-core-2.2.0.jar" \
      simple-spark-scala_2.11-1.0.jar local[2] sandbox-hdp.hortonworks.com:6667 wskafka_topic
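With the build file above, `sbt package` produces `target/scala-2.11/simple-spark-scala_2.11-1.0.jar`, the jar referenced in the command. For context, below is a minimal sketch of what a class like `streaming.KafkaStreamingDeptCount` could look like against the spark-streaming-kafka-0-8 API. This is an assumption for illustration, not the asker's actual code: the meaning of the three trailing arguments (execution mode, broker list, topic) and the department-count workload are guesses based on the command line.

    package streaming

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    // Hypothetical reconstruction: args mirror the three trailing
    // spark-submit arguments (execution mode, broker list, topic).
    object KafkaStreamingDeptCount {
      def main(args: Array[String]): Unit = {
        val Array(master, brokers, topic) = args

        // setMaster in code takes precedence over spark-submit's --master flag
        val conf = new SparkConf().setAppName("KafkaStreamingDeptCount").setMaster(master)
        val ssc = new StreamingContext(conf, Seconds(30))

        // Direct (receiver-less) stream from the 0-8 Kafka integration
        val kafkaParams = Map[String, String]("metadata.broker.list" -> brokers)
        val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
          ssc, kafkaParams, Set(topic))

        // Assumed workload: count hits per department in web-log-style
        // records such as "GET /department/fitness/products HTTP/1.1"
        val deptCounts = messages
          .map { case (_, line) => line }
          .filter(_.contains("/department/"))
          .map(line => (line.split("/department/")(1).split("/")(0), 1))
          .reduceByKey(_ + _)

        deptCounts.print()
        ssc.start()
        ssc.awaitTermination()
      }
    }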

I was facing the same issue and was stuck on it the whole day. I am practicing on a Hortonworks 2.6.4 sandbox with HDFS 2.7.3, Kafka 0.10.1 and Spark2 2.2.0, and kept getting different error messages like:
  1. Exception in thread "main" java.lang.ClassCastException: kafka.cluster.BrokerEndPoint cannot be cast to kafka.cluster.Broker

  2. Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$

  3. Exception in thread "main" org.apache.spark.SparkException: org.apache.spark.SparkException: Error getting partition metadata for 'test'. Does the topic exist?

  4. Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/kafka/common/network/Send

  5. java.io.EOFException at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:83) at kafka.network.BlockingChannel.readCompletely(BlockingChannel.scala:129) INFO SimpleConsumer: Reconnect due to socket error:

  6. java.io.EOFException: Received -1 when reading from channel, socket has probably been closed. Exception in thread "main"

  7. org.apache.spark.SparkException: java.io.EOFException: Received -1 when reading from channel, socket has probably been closed.
The main cause of all these errors was a version mismatch. Updating build.sbt to the versions above resolved the issue.
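The ClassCastException at the top is a typical symptom of this mismatch: spark-streaming-kafka-0-8 was built against the Kafka 0.8.2.1 broker classes (kafka.cluster.Broker), so a newer kafka_2.11 jar on the classpath supplies incompatible classes such as BrokerEndPoint. One quick way to see which jar each contested class is actually loaded from is to run something like the following from a spark-shell launched with the same --jars list (a diagnostic sketch, not part of the asker's solution):

    // Print the jar each class was loaded from; origins that don't match
    // the versions pinned in build.sbt point to the conflict described above.
    def jarOf(className: String): String =
      Class.forName(className).getProtectionDomain.getCodeSource.getLocation.toString

    Seq(
      "kafka.cluster.Broker",                       // expect kafka_2.11-0.8.2.1.jar
      "org.apache.kafka.common.network.Send",       // expect kafka-clients-0.10.2.1.jar
      "org.apache.spark.streaming.kafka.KafkaUtils" // expect spark-streaming-kafka-0-8
    ).foreach(name => println(s"$name -> ${jarOf(name)}"))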

0 answers:

No answers