Why does Spark Streaming's Kafka Direct Stream throw "EOFException: Received -1 when reading from channel"?

Asked: 2016-11-16 09:51:35

Tags: scala apache-kafka spark-streaming

I get this error when I try to consume a Kafka topic using Spark Streaming.

Our cluster is secured with Kerberos authentication.

Do we need to set the Kerberos authentication details in the Kafka params?

Sample code:

val topicsSet = "topic".split(",").toSet
val kafkaParams = Map[String, String]("metadata.broker.list" -> "1.2.3.4:6667")
val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
  ssc, kafkaParams, topicsSet)
val lines = messages.map(_._2)
lines.foreachRDD(rdd => rdd.foreach { x => println(x) })
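For context on the Kerberos question: the 0.8-style direct stream used above goes through Kafka's old SimpleConsumer protocol, which predates SASL support, so there is no Kerberos setting it could carry. With the newer Kafka 0.10 consumer API (the spark-streaming-kafka-0-10 integration), security settings are ordinarily passed as consumer properties in the parameter map. A minimal, hypothetical sketch of such a map (the property names come from Kafka's new-consumer configuration; the broker address and group id are placeholders):

```scala
// Hypothetical sketch, assuming the Kafka 0.10 consumer integration.
// security.protocol and sasl.kerberos.service.name are standard
// new-consumer configuration keys; values here are placeholders.
val kafkaParams = Map[String, String](
  "bootstrap.servers" -> "1.2.3.4:6667",
  "security.protocol" -> "SASL_PLAINTEXT",  // Kerberos over a plaintext channel
  "sasl.kerberos.service.name" -> "kafka",  // service principal name of the brokers
  "key.deserializer" -> "org.apache.kafka.common.serialization.StringDeserializer",
  "value.deserializer" -> "org.apache.kafka.common.serialization.StringDeserializer",
  "group.id" -> "test-consumer"             // placeholder consumer group
)
println(kafkaParams("security.protocol"))
```

Since the Spark 1.6 / Kafka 0.8 integration shown above has no equivalent setting, one plausible reading of the error is that a Kerberos-secured broker closes the connection (hence the -1 / EOF) when the client never performs the expected SASL handshake.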

The Kafka version on the cluster is 0.9.0.1.

Dependencies:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.6.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.6.1</version>
</dependency>

Error log:

INFO SimpleConsumer: Reconnect due to socket error: java.io.EOFException: Received -1 when reading from channel, socket has likely been closed.
Exception in thread "main" org.apache.spark.SparkException: java.io.EOFException: Received -1 when reading from channel, socket has likely been closed.
    at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply(KafkaCluster.scala:366)
    at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply(KafkaCluster.scala:366)
    at scala.util.Either.fold(Either.scala:97)
    at org.apache.spark.streaming.kafka.KafkaCluster$.checkErrors(KafkaCluster.scala:365)
    at org.apache.spark.streaming.kafka.KafkaUtils$.getFromOffsets(KafkaUtils.scala:222)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:484)
    at com.scb.upv.TestSparkStreamingConsumer$delayedInit$body.apply(TestSparkStreamingConsumer.scala:19)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at scala.App$$anonfun$main$1.apply(App.scala:71)
    at scala.App$$anonfun$main$1.apply(App.scala:71)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
    at scala.App$class.main(App.scala:71)
    at com.scb.upv.TestSparkStreamingConsumer$.main(TestSparkStreamingConsumer.scala:11)

0 Answers:

No answers yet.