Spark SQL window average problem

Time: 2018-12-06 13:17:03

Tags: scala apache-spark apache-spark-sql

I am trying to compute averages over a one-month window, but I cannot get the desired result. Please see the code and dataset below. Can you help me figure out what I am doing wrong?

Code:

val df = monthlyFilesDF.groupBy($"COL1", $"COL2", window($"EventTime", "1 month").alias("month"))
                     .agg(avg("COL4").alias("avg_COL4"), avg("COL5").alias("avg_COL5"), avg("COL6").alias("avg_COL6"))
                     .withColumn("month", $"month".cast(StringType))
                     .withColumn("avg_COL4", $"avg_COL4".cast(StringType))
                     .withColumn("avg_COL5", $"avg_COL5".cast(StringType))
                     .withColumn("avg_COL6", $"avg_COL6".cast(StringType))
df.show(10, false)

Sample dataset:

+------------+--------------+---------------+-----------------+---------------+--------------+---------------+
|COL1        |COL2          |COL3           |EventTime        |COL4           |COL5          |COL6           |
+------------+--------------+---------------+-----------------+---------------+--------------+---------------+
|ServiceCent4 |AP-1-IOO-PPP  |241.206.155.172|06-12-18:17:42:34|162            |53            |1544098354885  |
|ServiceCent1 |AP-1-SPG-QQQ  |178.182.57.167 |06-12-18:17:42:34|110            |30            |1544098354885  |
|ServiceCent4 |AP-1-SPG-DDD  |180.201.249.252|06-12-18:17:42:34|245            |19            |1544098354885  |
|ServiceCent3 |AP-1-SPG-SSS  |210.193.251.211|06-12-18:17:42:34|10             |88            |1544098354885  |
|ServiceCent4 |AP-2-SPG-GGG  |45.25.186.173  |06-12-18:17:42:34|219            |12            |1544098354886  |
|ServiceCent3 |AP-4-SPG-UI   |234.60.84.236  |06-12-18:17:42:34|216            |39            |1544098354886  |
|ServiceCent4 |AP-3-SPG-HUH  |101.244.98.173 |06-12-18:17:42:34|112            |26            |1544098354886  |
|ServiceCent4 |AP-4-SPG-GVF  |203.169.206.12 |06-12-18:17:42:34|115            |40            |1544098354886  |
|ServiceCent4 |AP-0-SPG-JOD  |156.158.45.6   |06-12-18:17:42:34|156            |76            |1544098354886  |
|ServiceCent4 |AP-1-SPG-13   |96.189.94.4    |06-12-18:17:42:34|119            |57            |1544098354886  |
+------------+--------------+---------------+-----------------+---------------+--------------+---------------+

Output:

+------------+--------------+-----+--------+--------+--------+
|COL1        |COL2          |month|avg_COL4|avg_COL5|avg_COL6|
+------------+--------------+-----+--------+--------+--------+
+------------+--------------+-----+--------+--------+--------+

1 Answer:

Answer 0 (score: 1)

Here is an example without window, using only groupBy/agg. Spark's time window() function does not accept month-length durations, and your EventTime column is a string rather than a timestamp, so parsing the string and grouping by year and month is a more reliable approach:

import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.{avg, month, to_timestamp, year}
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}
import spark.implicits._

val data = Seq(
  Row("ServiceCent4", "AP-1-IOO-PPP", "241.206.155.172", "06-12-18:17:42:34", 162),
  Row("ServiceCent1", "AP-1-SPG-QQQ", "178.182.57.167", "06-12-18:17:42:34", 110)
)

val schema = List(
  StructField("COL1", StringType, true),
  StructField("COL2", StringType, true),
  StructField("COL3", StringType, true),
  StructField("EventTimeString", StringType, true),
  StructField("COL4", IntegerType, true)
)


val df = spark.createDataFrame(
  spark.sparkContext.parallelize(data),
  StructType(schema)
)

/* convert string to timestamp
 * get month and year from timestamp
 * drop timestamp string
 */
val monthDF = df
  .withColumn("EventTime", to_timestamp($"EventTimeString", "MM-dd-yy:HH:mm:ss"))
  .withColumn("EventYear", year($"EventTime"))
  .withColumn("EventMonth", month($"EventTime"))
  .drop("EventTimeString")

monthDF.groupBy("COL1", "COL2", "EventYear", "EventMonth").agg(avg("COL4")).show()

+------------+------------+---------+----------+---------+
|        COL1|        COL2|EventYear|EventMonth|avg(COL4)|
+------------+------------+---------+----------+---------+
|ServiceCent4|AP-1-IOO-PPP|     2018|         6|    162.0|
|ServiceCent1|AP-1-SPG-QQQ|     2018|         6|    110.0|
+------------+------------+---------+----------+---------+
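
If you prefer a single month key instead of separate year/month columns, date_trunc (available since Spark 2.3) can collapse every timestamp to the first instant of its month. This is a sketch under the same schema assumptions as the answer above (a df with an EventTimeString column in the MM-dd-yy:HH:mm:ss pattern):

```scala
import org.apache.spark.sql.functions.{avg, date_trunc, to_timestamp}

// Parse the string first; all time functions need a real timestamp column.
val withTs = df.withColumn("EventTime",
  to_timestamp($"EventTimeString", "MM-dd-yy:HH:mm:ss"))

// date_trunc("month", ...) maps e.g. 2018-06-12 17:42:34 to 2018-06-01 00:00:00,
// so it serves as a single calendar-month grouping key.
withTs
  .groupBy($"COL1", $"COL2", date_trunc("month", $"EventTime").alias("month"))
  .agg(avg("COL4").alias("avg_COL4"))
  .show(false)
```

This avoids window() entirely, which is necessary here because window durations are fixed-length intervals and cannot express a calendar month.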