Spark DataFrames: CASE statement when using the window PARTITION BY syntax

Date: 2018-05-28 07:03:35

Tags: scala apache-spark apache-spark-sql

I need to check a condition: if ReasonCode is "Yes", then use ProcessDate as one of the PARTITION BY columns; otherwise, leave it out.

The equivalent SQL query is:

SELECT PNum,
       SUM(SIAmt) OVER (PARTITION BY PNum,
                                     ReasonCode,
                                     CASE WHEN ReasonCode = 'Yes' THEN ProcessDate ELSE NULL END
                        ORDER BY ProcessDate
                        RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS SumAmt
FROM TABLE1

So far I have tried the query below, but I cannot work out how to incorporate the condition CASE WHEN ReasonCode = 'Yes' THEN ProcessDate ELSE NULL END in Spark DataFrames:

import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

val df = inputDF
  .withColumn("SumAmt", sum("SIAmt").over(
    Window.partitionBy("PNum", "ReasonCode").orderBy("ProcessDate")))

Input data:

---------------------------------------
Pnum    ReasonCode  ProcessDate SIAmt
---------------------------------------
1       No          1/01/2016   200
1       No          2/01/2016   300
1       Yes         3/01/2016   -200
1       Yes         4/01/2016   200
---------------------------------------

Expected output:

---------------------------------------------
Pnum    ReasonCode  ProcessDate SIAmt  SumAmt
---------------------------------------------
1       No          1/01/2016   200     200 
1       No          2/01/2016   300     500
1       Yes         3/01/2016   -200    -200
1       Yes         4/01/2016   200     200
---------------------------------------------

Any suggestions/help on doing this with the Spark DataFrame API rather than a spark-sql query?

1 Answer:

Answer 0 (score: 1)

You can apply an exact replica of the same SQL in the DataFrame API:
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

val df = inputDF
  .withColumn("SumAmt", sum("SIAmt").over(
    Window
      .partitionBy(
        col("PNum"),
        col("ReasonCode"),
        when(col("ReasonCode") === "Yes", col("ProcessDate")).otherwise(null))
      .orderBy("ProcessDate")))

You can also add the .rowsBetween(Long.MinValue, 0) part, spelled out in the sketch below, and it should give you:
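
A minimal sketch of the full window spec with that explicit frame (assuming Spark 2.1+, where Window.unboundedPreceding and Window.currentRow are the named equivalents of Long.MinValue and 0):

import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

// Same window as above, now with an explicit ROWS frame that runs
// from the start of the partition to the current row.
val win = Window
  .partitionBy(
    col("PNum"),
    col("ReasonCode"),
    // when(...) with no .otherwise(...) yields NULL for non-matching rows,
    // mirroring CASE ... ELSE NULL END
    when(col("ReasonCode") === "Yes", col("ProcessDate")))
  .orderBy("ProcessDate")
  .rowsBetween(Window.unboundedPreceding, Window.currentRow)

val df = inputDF.withColumn("SumAmt", sum("SIAmt").over(win))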

+----+----------+-----------+-----+------+
|Pnum|ReasonCode|ProcessDate|SIAmt|SumAmt|
+----+----------+-----------+-----+------+
|   1|       Yes|  4/01/2016|  200|   200|
|   1|        No|  1/01/2016|  200|   200|
|   1|        No|  2/01/2016|  300|   500|
|   1|       Yes|  3/01/2016| -200|  -200|
+----+----------+-----------+-----+------+
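
For reference, a self-contained sketch that reproduces the result end to end. The construction of inputDF is hypothetical, rebuilt from the question's sample rows, and it assumes a local SparkSession named spark; ProcessDate stays a string, which happens to sort correctly for these rows.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

object CasePartitionExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("case-partition")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical reconstruction of the question's input data;
    // ProcessDate is kept as a string, as in the question.
    val inputDF = Seq(
      (1, "No",  "1/01/2016",  200),
      (1, "No",  "2/01/2016",  300),
      (1, "Yes", "3/01/2016", -200),
      (1, "Yes", "4/01/2016",  200)
    ).toDF("PNum", "ReasonCode", "ProcessDate", "SIAmt")

    val win = Window
      .partitionBy(
        col("PNum"),
        col("ReasonCode"),
        when(col("ReasonCode") === "Yes", col("ProcessDate")))
      .orderBy("ProcessDate")
      .rowsBetween(Window.unboundedPreceding, Window.currentRow)

    inputDF.withColumn("SumAmt", sum("SIAmt").over(win)).show()

    spark.stop()
  }
}

The trick is that the conditional column yields NULL for every "No" row, so all of a PNum's "No" rows fall into one partition and the sum accumulates across them, while each "Yes" row gets its own (PNum, ReasonCode, ProcessDate) partition and the sum restarts at that single row.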