writeStream with append option and a window function

Asked: 2019-03-26 17:54:11

Tags: pyspark spark-streaming

I am trying to use writeStream with the append output mode, but I get an error.

Code:

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count, when, window

spark = SparkSession\
        .builder\
        .appName("get_sensor_data")\
        .getOrCreate()
spark.sparkContext.setLogLevel("ERROR")

# 'lines', 'windowDuration' and 'slideDuration' are defined earlier (not shown)
Sensor = lines.select(lines.value.alias('Sensor'),
        lines.timestamp)

windowedCounts = Sensor.withWatermark('timestamp', '10 seconds').groupBy(
        window(Sensor.timestamp, windowDuration, slideDuration)).\
        agg(count(when(col('Sensor') == "LR1 On", True)).alias('LR1'),
            count(when(col('Sensor') == "LR2 On", True)).alias('LR2'),
            count(when(col('Sensor') == "LD On", True)).alias('LD')).\
        orderBy('window')

query = windowedCounts\
        .writeStream\
        .outputMode('append')\
        .format("console")\
        .start()

Error:

Append output mode not supported when there are streaming aggregations on streaming DataFrames/DataSets without watermark

The reason for using the append option is that I want to save the output as a CSV file later. I think the problem is caused by the window function, but I don't know how to fix it.
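A commonly suggested workaround (not from this thread; a hedged sketch under assumptions) is to drop the orderBy: sorting a streaming DataFrame is only supported after aggregation in complete output mode, so it prevents the watermark from enabling append mode. With the watermark kept on the same event-time column used by window(), append mode can emit each window once it is finalized. The socket source, host/port, durations, and output paths below are placeholders, not taken from the question:

```python
# Hypothetical sketch: append-mode windowed counts written to a CSV sink.
# The source and all paths/durations are assumptions for illustration.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count, when, window

spark = SparkSession.builder.appName("get_sensor_data").getOrCreate()

# Placeholder streaming source; includeTimestamp adds a 'timestamp' column.
lines = spark.readStream.format("socket") \
        .option("host", "localhost").option("port", 9999) \
        .option("includeTimestamp", True).load()

sensor = lines.select(lines.value.alias("Sensor"), lines.timestamp)

# Watermark on the same event-time column used by window(); no orderBy,
# so append mode can emit each window after the watermark passes its end.
windowedCounts = sensor.withWatermark("timestamp", "10 seconds").groupBy(
        window(sensor.timestamp, "30 seconds", "10 seconds")).agg(
        count(when(col("Sensor") == "LR1 On", True)).alias("LR1"),
        count(when(col("Sensor") == "LR2 On", True)).alias("LR2"),
        count(when(col("Sensor") == "LD On", True)).alias("LD"))

# The CSV sink cannot serialize the struct-typed 'window' column,
# so flatten it into plain start/end columns first.
output = windowedCounts.select(
        col("window.start").alias("window_start"),
        col("window.end").alias("window_end"),
        "LR1", "LR2", "LD")

# File sinks support append mode only and require a checkpoint location.
query = output.writeStream \
        .outputMode("append") \
        .format("csv") \
        .option("path", "/tmp/sensor_counts") \
        .option("checkpointLocation", "/tmp/sensor_checkpoint") \
        .start()
```

If a sorted view is needed, the written CSV files can be ordered in a separate batch read, since ordering is not meaningful while the stream is still appending.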

0 Answers
