Spark Streaming RDD to JSON output without deprecation

Time: 2018-04-09 20:38:49

Tags: json apache-spark spark-dataframe spark-streaming rdd

In the snippet below, the `df5` DataFrame prints the JSON correctly, but `isStreaming` is false, and reading JSON from an RDD this way is also deprecated in Spark 2.2.0. So I tried another approach in the last two lines of code, but it cannot read the JSON correctly. Any suggestions?

val unionStreams = ssc.union(kinesisStreams)
unionStreams.foreachRDD ((rdd: RDD[Array[Byte]], time: Time) => {
  val rowRDD = rdd.map(jstr => new String(jstr))
  val schema = StructType(
    StructField("clientTime", StringType, nullable = true) ::
    StructField("clientIPAddress", StringType, nullable = true) :: Nil)

  val df5 = sqlContext.read.schema(schema).json(rowRDD)
  println(df5.isStreaming)

  val df6 = spark.readStream.schema(schema).json(rdd.toString())
  println(df6.isStreaming)
})

1 answer:

Answer 0 (score: 0)

Use a Dataset[String] instead of the RDD:

import sqlContext.implicits._

sqlContext.read.schema(schema).json(rowRDD.toDS)
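
Putting the answer together with the question's `foreachRDD` loop, a minimal sketch might look like the following (it assumes the same `ssc`, `kinesisStreams`, and `sqlContext` from the question, and requires a running Spark Streaming application, so it is illustrative rather than standalone):

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.Time
import org.apache.spark.sql.types.{StructType, StructField, StringType}

val unionStreams = ssc.union(kinesisStreams)

unionStreams.foreachRDD { (rdd: RDD[Array[Byte]], time: Time) =>
  // Decode the raw Kinesis record payloads into JSON strings
  val rowRDD = rdd.map(bytes => new String(bytes))

  val schema = StructType(
    StructField("clientTime", StringType, nullable = true) ::
    StructField("clientIPAddress", StringType, nullable = true) :: Nil)

  // toDS needs the implicit encoders in scope
  import sqlContext.implicits._

  // json(Dataset[String]) is the non-deprecated overload in Spark 2.2+,
  // replacing the deprecated json(RDD[String])
  val df5 = sqlContext.read.schema(schema).json(rowRDD.toDS())
  df5.show()
}
```

Note that `df5.isStreaming` will still be false here: `foreachRDD` hands you a plain batch DataFrame per micro-batch, which is expected with the DStream API. A DataFrame with `isStreaming == true` would come from Structured Streaming (`spark.readStream`) reading from a source directly, not from inside `foreachRDD`.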