Kafka to Spark Structured Streaming: exception on dataframe.writeStream()

Time: 2019-08-23 09:16:15

Tags: apache-spark apache-kafka spark-streaming-kafka

While running Structured Streaming with Spark 2.3.0 (using Java) and Kafka 2.2.0, I get the following exception:

Exception in thread "main" com.fasterxml.jackson.databind.JsonMappingException: No content to map due to end-of-input

Please note: the stream's isStreaming() status is true.
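For context, here is a minimal sketch of the kind of pipeline involved, reconstructed from the stack trace below. The topic name, bootstrap servers, and master setting are assumptions for illustration, not taken from the original post; it also assumes the spark-sql-kafka-0-10 dependency is on the classpath.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import org.apache.spark.sql.streaming.StreamingQueryException;

    public class KafkaSparkIntegration {
        public static void main(String[] args) throws StreamingQueryException {
            SparkSession spark = SparkSession.builder()
                    .appName("KafkaSparkIntegration")
                    .master("local[*]")   // assumption: local run
                    .getOrCreate();

            // Kafka source (topic and bootstrap servers are placeholders).
            Dataset<Row> df = spark.readStream()
                    .format("kafka")
                    .option("kafka.bootstrap.servers", "localhost:9092")
                    .option("subscribe", "test-topic")
                    .load();

            System.out.println("Is streaming Status **************** " + df.isStreaming());

            // The call the question reports as throwing.
            df.writeStream()
                    .format("console")
                    .option("checkpointLocation", "E:\\tmp")
                    .start()
                    .awaitTermination();
        }
    }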

The exception is thrown on the following line of code:

df.writeStream().format("console").option("checkpointLocation","E:\\tmp").start();

Exception:

Is streaming Status  **************** true
2019-08-23 14:15:46 ERROR StreamMetadata:91 - Error reading stream metadata from file:/E:/tmp/metadata
com.fasterxml.jackson.databind.JsonMappingException: No content to map due to end-of-input
 at [Source: java.io.InputStreamReader@1c581a0; line: 1, column: 1]
    at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
    at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:3781)
    at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3721)
    at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2773)
    at org.json4s.jackson.JsonMethods$class.parse(JsonMethods.scala:21)
    at org.json4s.jackson.JsonMethods$.parse(JsonMethods.scala:50)
    at org.json4s.jackson.Serialization$.read(Serialization.scala:65)
    at org.apache.spark.sql.execution.streaming.StreamMetadata$.read(StreamMetadata.scala:56)
    at org.apache.spark.sql.execution.streaming.StreamExecution.<init>(StreamExecution.scala:122)
    at org.apache.spark.sql.execution.streaming.MicroBatchExecution.<init>(MicroBatchExecution.scala:49)
    at org.apache.spark.sql.streaming.StreamingQueryManager.createQuery(StreamingQueryManager.scala:258)
    at org.apache.spark.sql.streaming.StreamingQueryManager.startQuery(StreamingQueryManager.scala:299)
    at org.apache.spark.sql.streaming.DataStreamWriter.start(DataStreamWriter.scala:296)
    at SparkInJava.KafkaSparkIntegration.StreamLogic(KafkaSparkIntegration.java:42)
    at SparkInJava.KafkaSparkIntegration.main(KafkaSparkIntegration.java:26)
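One detail worth noting in the trace: the failure happens inside StreamMetadata$.read while parsing file:/E:/tmp/metadata, and "No content to map due to end-of-input" is what Jackson reports when asked to parse empty input. That suggests a zero-length metadata file left behind in the checkpoint directory by an earlier run. A minimal sketch of how one might verify that (the class name is hypothetical; the path is the one from the trace):

    import java.io.File;

    public class CheckpointMetadataCheck {
        public static void main(String[] args) {
            File metadata = new File("E:\\tmp\\metadata");
            if (metadata.exists() && metadata.length() == 0) {
                // An empty file has no JSON content, so Jackson fails with
                // "No content to map due to end-of-input" when Spark tries
                // to read it as stream metadata at query startup.
                System.out.println("Empty checkpoint metadata file: " + metadata);
            }
        }
    }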

0 Answers:

No answers yet.