How to merge rows in a DataFrame based on a condition in pyspark

Time: 2018-08-29 03:33:40

Tags: apache-spark dataframe pyspark apache-spark-sql rdd

I have to process a DataFrame containing application logs (IN and OUT events). The data looks like the table below.


How can the pairs of rows that make up a session (an IN followed by an OUT) be merged to produce one row per session?

USER | DATETIME         | IN_OUT
---------------------------------
0002   2018/08/28 12:00   IN
0002   2018/08/28 12:20   OUT
0003   2018/08/28 13:00   IN
0003   2018/08/28 14:20   OUT
0003   2018/08/28 15:00   IN
0003   2018/08/28 16:00   OUT

1 Answer:

Answer 0: (score: 0)

If you can ensure that every IN event is always followed by an OUT event, you can use the code below (it checks both the IN and the OUT side, but it will not work if IN and OUT do not alternate).

import datetime

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, lag, when
from pyspark.sql.window import Window as W

spark = SparkSession.builder.getOrCreate()

# Create the example DataFrame
test_df = spark.createDataFrame([
    (2, datetime.datetime(2018, 8, 28, 12, 0), "IN"),
    (2, datetime.datetime(2018, 8, 28, 12, 20), "OUT"),
    (3, datetime.datetime(2018, 8, 28, 13, 0), "IN"),
    (3, datetime.datetime(2018, 8, 28, 14, 20), "OUT"),
    (3, datetime.datetime(2018, 8, 28, 15, 0), "IN"),
    (3, datetime.datetime(2018, 8, 28, 16, 0), "OUT"),
], ("USER", "DATETIME", "IN_OUT"))

# Order by datetime and process every user separately
w = W.partitionBy("USER").orderBy("DATETIME")

# If the previous event in the window was IN and the current one is OUT,
# carry the previous timestamp forward; otherwise leave it null
get_in = when((lag("IN_OUT", 1).over(w) == "IN") & (col("IN_OUT") == "OUT"),
              lag("DATETIME", 1).over(w)).otherwise(None)

# Keep only the OUT rows that close a session and compute the
# session length in minutes relative to the preceding IN time
(test_df
 .withColumn("DATETIMEIN", get_in.cast("timestamp"))
 .withColumn("DATETIMEOUT", col("DATETIME"))
 .filter(col("DATETIMEIN").isNotNull())
 .withColumn("SESSIONTIME[Minutes]",
             (col("DATETIME").cast("long") - col("DATETIMEIN").cast("long")) / 60)
 .select("USER", "DATETIMEIN", "DATETIMEOUT", "SESSIONTIME[Minutes]")
 .show())

Result:

+----+-------------------+-------------------+--------------------+
|USER|         DATETIMEIN|        DATETIMEOUT|SESSIONTIME[Minutes]|
+----+-------------------+-------------------+--------------------+
|   3|2018-08-28 13:00:00|2018-08-28 14:20:00|                80.0|
|   3|2018-08-28 15:00:00|2018-08-28 16:00:00|                60.0|
|   2|2018-08-28 12:00:00|2018-08-28 12:20:00|                20.0|
+----+-------------------+-------------------+--------------------+
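
The pairing above silently drops any event that does not close a session, so it is worth checking the alternation assumption first. As a hedged sketch (not part of the original answer; the column name ANOMALY is my own choice), you can reuse the same window to surface rows whose event type repeats, which are exactly the cases the pairing cannot handle:

# Reuses test_df and w from the answer above.
# Flag rows whose event type repeats (IN after IN, or OUT after OUT);
# these are the rows the pairing logic silently drops.
anomalies = (test_df
    .withColumn("ANOMALY", lag("IN_OUT", 1).over(w) == col("IN_OUT"))
    .filter(col("ANOMALY")))
anomalies.show()  # empty when IN and OUT alternate correctly for every user

An empty result confirms that IN and OUT alternate for every user, so the merged output accounts for all events.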