How would you generate a new array column over a window?

Date: 2018-12-20 11:15:41

Tags: pyspark pyspark-sql

I am trying to generate a new column that is an array collected over a window, but it appears that array functions cannot be evaluated over a window, and I am struggling to find another way.

Code snippet:

df = df.withColumn('array_output', F.array(df.things_to_agg_in_array).over(Window.partitionBy("aggregate_over_this")))

Ideally, the output I want looks like the table below:

+---------------------+------------------------+--------------+
| Aggregate Over This | Things to Agg in Array | Array Output |
+---------------------+------------------------+--------------+
| 1                   | C                      | [C,F,K,L]    |
| 1                   | F                      | [C,F,K,L]    |
| 1                   | K                      | [C,F,K,L]    |
| 1                   | L                      | [C,F,K,L]    |
| 2                   | A                      | [A,B,C]      |
| 2                   | B                      | [A,B,C]      |
| 2                   | C                      | [A,B,C]      |
+---------------------+------------------------+--------------+

For context, this is part of an explode; the result will then be joined back onto another table on "Aggregate Over This", returning only one instance of array_output.

Thanks

1 Answer:

Answer 0 (score: 1)

This solution uses collect_list(); I'm not sure whether it meets your requirements.

myValues = [(1,'C'),(1,'F'),(1,'K'),(1,'L'),(2,'A'),(2,'B'),(2,'C')]
df = sqlContext.createDataFrame(myValues,['Aggregate_Over_This','Things_to_Agg_in_Array'])
df.show()
+-------------------+----------------------+
|Aggregate_Over_This|Things_to_Agg_in_Array|
+-------------------+----------------------+
|                  1|                     C|
|                  1|                     F|
|                  1|                     K|
|                  1|                     L|
|                  2|                     A|
|                  2|                     B|
|                  2|                     C|
+-------------------+----------------------+
df.registerTempTable('table_view')
df1=sqlContext.sql(
    'select Aggregate_Over_This, Things_to_Agg_in_Array, collect_list(Things_to_Agg_in_Array) over (partition by Aggregate_Over_This) as array_output from table_view'
)
df1.show()
+-------------------+----------------------+------------+
|Aggregate_Over_This|Things_to_Agg_in_Array|array_output|
+-------------------+----------------------+------------+
|                  1|                     C|[C, F, K, L]|
|                  1|                     F|[C, F, K, L]|
|                  1|                     K|[C, F, K, L]|
|                  1|                     L|[C, F, K, L]|
|                  2|                     A|   [A, B, C]|
|                  2|                     B|   [A, B, C]|
|                  2|                     C|   [A, B, C]|
+-------------------+----------------------+------------+