Unable to print in a Spark Streaming application

Time: 2017-07-17 18:49:56

Tags: spark-streaming apache-spark-2.0

My Spark Streaming application is not printing a simple statement to the driver's stdout. Here I am trying to print a statement just after transforming dstream_2, but it is only printed for the first batch. I want it to be printed on every batch execution.

val sparkConf = new SparkConf().setMaster("yarn-cluster")
                               .setAppName("SparkJob")
                               .set("spark.executor.memory","2G")
                               .set("spark.dynamicAllocation.executorIdleTimeout","5")


val streamingContext = new StreamingContext(sparkConf, Minutes(1))

var historyRdd: RDD[(String, ArrayList[String])] = streamingContext.sparkContext.emptyRDD

var historyRdd_2: RDD[(String, ArrayList[String])] = streamingContext.sparkContext.emptyRDD


val dstream_1 = KafkaUtils.createDirectStream[String, GenericData.Record, StringDecoder, GenericDataRecordDecoder](streamingContext, kafkaParams, Set(inputTopic_1))
val stream_2 = KafkaUtils.createDirectStream[String, GenericData.Record, StringDecoder, GenericDataRecordDecoder](streamingContext, kafkaParams, Set(inputTopic_2))


val dstream_2 = stream_2.map((r: Tuple2[String, GenericData.Record]) =>
{
  //some mapping
})
//Not working: only printed once
print("Printing Test")
val historyDStream = dstream_1.transform(rdd => rdd.union(historyRdd))
dstream_2.foreachRDD(r => r.repartition(500))
val historyDStream_2 = dstream_2.transform(rdd => rdd.union(historyRdd_2))
val fullJoinResult = historyDStream.fullOuterJoin(historyDStream_2)

 val filtered = fullJoinResult.filter(r => r._2._1.isEmpty)


filtered.foreachRDD{rdd =>

  val formatted = rdd.map(r  => (r._1 , r._2._2.get)) 

  historyRdd_2.unpersist(false) // unpersist the 'old' history RDD
  historyRdd_2 = formatted // assign the new history
  historyRdd_2.persist(StorageLevel.MEMORY_AND_DISK) // cache the computation
}


val filteredStream = fullJoinResult.filter(r => r._2._2.isEmpty)


filteredStream.foreachRDD{rdd =>
  val formatted = rdd.map(r => (r._1 , r._2._1.get)) 
  historyRdd.unpersist(false) // unpersist the 'old' history RDD
  historyRdd = formatted // assign the new history
  historyRdd.persist(StorageLevel.MEMORY_AND_DISK) // cache the computation
}
streamingContext.start()
streamingContext.awaitTermination()

} }

1 Answer:

Answer 0 (score: 1):

The `print("Printing Test")` at that location executes only once, when the program is first evaluated. To add console output at each batch interval, we need to place the I/O operation within the scope of an output operation:

This will print on every batch:

dstream_2.foreachRDD { _ => print("Printing Test") }
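To see why, here is a minimal plain-Scala analogy (no Spark required; `run` and the sample batches are made up for illustration): statements written at DStream-definition time run once, while code inside an output operation's closure, like `foreachRDD`, runs on every batch.

```scala
object BatchAnalogy {
  // Simulates streaming setup plus per-batch work; returns a log of what ran.
  def run(batches: Seq[Seq[Int]]): Seq[String] = {
    // This line models print() placed at definition time: it runs exactly once.
    val setupLog = Seq("setup: runs once")
    // This closure models foreachRDD: it runs once per batch.
    val batchLog = batches.map(batch => s"processing batch of ${batch.size}")
    setupLog ++ batchLog
  }

  def main(args: Array[String]): Unit =
    run(Seq(Seq(1, 2), Seq(3), Seq(4, 5))).foreach(println)
}
```

Running `main` prints the setup line once followed by one line per batch, which mirrors the behavior the answer describes.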