PySpark: merging multiple dataframes (outer join) and keeping the primary key columns only once (join on two columns/keys)

Asked: 2019-07-13 09:52:46

Tags: python-3.x merge pyspark outer-join

I have two dataframes, df2 and df3. df3 (the one with the agg_sum_15_110 column) looks like this:

+----------+-------------------+-------------------+--------------+
|Event_Type|              start|                end|agg_sum_15_110|
+----------+-------------------+-------------------+--------------+
|    event1|2016-04-25 05:30:00|2016-05-02 05:30:00|           1.0|
|    event1|2016-05-30 05:30:00|2016-06-06 05:30:00|           1.0|
|    event2|2016-05-02 05:30:00|2016-05-09 05:30:00|           2.0|
|    event2|2016-05-16 05:30:00|2016-05-23 05:30:00|           2.0|
|    event3|2016-05-02 05:30:00|2016-05-09 05:30:00|          11.0|
|    event3|2016-05-23 05:30:00|2016-05-30 05:30:00|           1.0|
+----------+-------------------+-------------------+--------------+

I join them with the following code:

    dftotal = df2.join(df3, ((df2.Event_Type == df3.Event_Type) & (df2.start == df3.start)), 'outer')

which gives the following output:

+----------+-------------------+-------------------+-------------+----------+-------------------+-------------------+--------------+
|Event_Type|              start|                end|agg_sum_10_15|Event_Type|              start|                end|agg_sum_15_110|
+----------+-------------------+-------------------+-------------+----------+-------------------+-------------------+--------------+
|      null|               null|               null|         null|    event3|2016-05-23 05:30:00|2016-05-30 05:30:00|           1.0|
|    event2|2016-05-09 05:30:00|2016-05-16 05:30:00|          1.0|      null|               null|               null|          null|
|    event1|2016-05-09 05:30:00|2016-05-16 05:30:00|          3.0|      null|               null|               null|          null|
|    event3|2016-05-16 05:30:00|2016-05-23 05:30:00|          1.0|      null|               null|               null|          null|
|      null|               null|               null|         null|    event1|2016-05-30 05:30:00|2016-06-06 05:30:00|           1.0|
|      null|               null|               null|         null|    event2|2016-05-02 05:30:00|2016-05-09 05:30:00|           2.0|
|      null|               null|               null|         null|    event3|2016-05-02 05:30:00|2016-05-09 05:30:00|          11.0|
|    event2|2016-06-06 05:30:00|2016-06-13 05:30:00|          1.0|      null|               null|               null|          null|
|    event3|2016-06-13 05:30:00|2016-06-20 05:30:00|          1.0|      null|               null|               null|          null|
|      null|               null|               null|         null|    event2|2016-05-16 05:30:00|2016-05-23 05:30:00|           2.0|
|    event1|2016-06-06 05:30:00|2016-06-13 05:30:00|          3.0|      null|               null|               null|          null|
|    event1|2016-04-25 05:30:00|2016-05-02 05:30:00|          1.0|    event1|2016-04-25 05:30:00|2016-05-02 05:30:00|           1.0|
+----------+-------------------+-------------------+-------------+----------+-------------------+-------------------+--------------+

There can be several such dataframes. The keys/columns to match on are 'Event_Type' and 'start'. When I join them (outer join), these key columns are duplicated. Is there a way to have each key column appear only once, filled with nulls where there is no match?

The purpose of the outer join: whenever there is a match (based on the keys) there should be a single row, and when there is no match an extra row is added, with nulls for the missing values.
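For reference, a minimal sketch of one way to avoid the duplicated key columns, assuming both dataframes use exactly the column names Event_Type and start: passing a list of column names to join() makes Spark keep a single copy of each key column. The list of dataframes in the second part is purely illustrative.

    from functools import reduce

    # Equi-join on a list of column names: the result keeps a single
    # "Event_Type" and a single "start" column instead of one per dataframe.
    dftotal = df2.join(df3, on=["Event_Type", "start"], how="outer")

    # The same idea extends to several dataframes by folding the join
    # over a list of them (all assumed to share the two key columns).
    dfs = [df2, df3]  # add further dataframes here
    dftotal_many = reduce(
        lambda left, right: left.join(right, on=["Event_Type", "start"], how="outer"),
        dfs,
    )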

The join itself was done with the code shown above.

I want a single 'Event_Type' column, and similarly for the 'start' field: the nulls in the first 'Event_Type' column should be filled with the values from the second 'Event_Type' column. I hope this explains the desired output. I read somewhere that the 'coalesce' function might help.

1 Answer:

Answer 0 (score: 2):

You are right. coalesce is exactly what you are looking for.

    >>> from pyspark.sql.functions import *
    >>> dftotal = df2.join(df3,((df2.Event_Type == df3.Event_Type) & (df2.start == df3.start )), 'outer').select(coalesce(df2.Event_Type,df3.Event_Type),coalesce(df2.start,df3.start),df2.end,df2.agg_sum_10_15,df3.end,df3.agg_sum_15_110)
    >>> dftotal.show()
    +--------------------------------+----------------------+-------------------+-------------+-------------------+--------------+
    |coalesce(Event_Type, Event_Type)|coalesce(start, start)|                end|agg_sum_10_15|                end|agg_sum_15_110|
    +--------------------------------+----------------------+-------------------+-------------+-------------------+--------------+
    |                          event1|   2016-05-09 05:30:00|2016-05-16 05:30:00|          3.0|               null|          null|
    |                          event1|   2016-06-06 05:30:00|2016-06-13 05:30:00|          3.0|               null|          null|
    |                          event2|   2016-05-02 05:30:00|               null|         null|2016-05-09 05:30:00|           2.0|
    |                          event3|   2016-05-02 05:30:00|               null|         null|2016-05-09 05:30:00|          11.0|
    |                          event2|   2016-05-16 05:30:00|               null|         null|2016-05-23 05:30:00|           2.0|
    |                          event1|   2016-05-30 05:30:00|               null|         null|2016-06-06 05:30:00|           1.0|
    |                          event3|   2016-05-16 05:30:00|2016-05-23 05:30:00|          1.0|               null|          null|
    |                          event2|   2016-06-06 05:30:00|2016-06-13 05:30:00|          1.0|               null|          null|
    |                          event1|   2016-04-25 05:30:00|2016-05-02 05:30:00|          1.0|2016-05-02 05:30:00|           1.0|
    |                          event3|   2016-06-13 05:30:00|2016-06-20 05:30:00|          1.0|               null|          null|
    |                          event3|   2016-05-23 05:30:00|               null|         null|2016-05-30 05:30:00|           1.0|
    |                          event2|   2016-05-09 05:30:00|2016-05-16 05:30:00|          1.0|               null|          null|
    +--------------------------------+----------------------+-------------------+-------------+-------------------+--------------+
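As a possible follow-up (not part of the original answer), the coalesced columns can be renamed with .alias() so the result has clean, unambiguous column names; the names end_10_15 and end_15_110 below are made up for illustration.

    from pyspark.sql.functions import coalesce

    # Same outer join as above, but giving the coalesced key columns and the
    # per-dataframe "end" columns explicit names (illustrative names).
    dftotal = (
        df2.join(
            df3,
            (df2.Event_Type == df3.Event_Type) & (df2.start == df3.start),
            "outer",
        )
        .select(
            coalesce(df2.Event_Type, df3.Event_Type).alias("Event_Type"),
            coalesce(df2.start, df3.start).alias("start"),
            df2.end.alias("end_10_15"),
            df2.agg_sum_10_15,
            df3.end.alias("end_15_110"),
            df3.agg_sum_15_110,
        )
    )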