Calculating and aggregating data by date/time

Time: 2019-02-05 08:14:25

Tags: pyspark etl data-warehouse databricks

I am working with a dataframe like this:

Id     | TimeStamp         | Event     |  DeviceId
1      | 5.2.2019 8:00:00  | connect   |  1
2      | 5.2.2019 8:00:05  | disconnect|  1

I am using Databricks and PySpark for an ETL process. How can I compute and create the dataframe shown at the bottom? I have tried using a UDF, but I could not find a way to make it work. I also tried iterating over the whole dataframe, but that is very slow.

I want to aggregate this dataframe into a new dataframe that tells me, for each device, when it was connected and when it was disconnected:

Id     | StartDateTime   | EndDateTime   | EventDuration  |State    |  DeviceId
1      | 5.2.19 8:00:00  | 5.2.19 8:00:05| 0.00:00:05     |connected|  1

1 Answer:

Answer 0: (score: 1)

I think you can use a window function and withColumn to create the additional columns.

The code I ran creates the mapping for the device and builds a table with the duration of each state. The only requirement is that connect and disconnect events alternate for each device (see the sketch after the first output below for one way to pre-clean data that does not).

You can then use the following code:

from pyspark.sql.types import *
from pyspark.sql.functions import *
from pyspark.sql.window import Window
import datetime

# creation of a dataframe with 4 events for 1 device
test_df = sqlContext.createDataFrame(
    [(1, datetime.datetime(2019, 2, 5, 8), "connect", 1),
     (2, datetime.datetime(2019, 2, 5, 8, 0, 5), "disconnect", 1),
     (3, datetime.datetime(2019, 2, 5, 8, 10), "connect", 1),
     (4, datetime.datetime(2019, 2, 5, 8, 20), "disconnect", 1)],
    ["Id", "TimeStamp", "Event", "DeviceId"])
test_df.show()

Output:

+---+-------------------+----------+--------+
| Id|          TimeStamp|     Event|DeviceId|
+---+-------------------+----------+--------+
|  1|2019-02-05 08:00:00|   connect|       1|
|  2|2019-02-05 08:00:05|disconnect|       1|
|  3|2019-02-05 08:10:00|   connect|       1|
|  4|2019-02-05 08:20:00|disconnect|       1|
+---+-------------------+----------+--------+
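As mentioned above, this approach assumes that connect and disconnect events strictly alternate per device. If your raw data can contain consecutive events of the same type, one possible pre-processing step is to compare each event with the previous one per device and drop the repetitions. This is a minimal sketch of my own, not part of the original answer; asc_window, prev_event and deduped_df are names introduced only for illustration:

asc_window = Window.partitionBy("DeviceId").orderBy("TimeStamp")  # chronological window per device
prev_event = lag(col("Event"), 1).over(asc_window)                # event type of the previous row

deduped_df = test_df.withColumn("PrevEvent", prev_event)\
    .filter(col("PrevEvent").isNull() | (col("PrevEvent") != col("Event")))\
    .drop("PrevEvent")   # keep only rows where the event type actually changes
deduped_df.show()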

Then you can create the window and the helper expressions:

my_window = Window.partitionBy("DeviceId").orderBy(col("TimeStamp").desc())  # window per device, newest event first
get_prev_time = lag(col("TimeStamp"), 1).over(my_window)                     # previous row in the descending window = timestamp of the next event
time_diff = get_prev_time.cast("long") - col("TimeStamp").cast("long")       # duration until the next event, in seconds
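Because the window is ordered by TimeStamp in descending order, lag(..., 1) returns the row above in that window, i.e. the timestamp of the chronologically next event for the same device. A quick check (not part of the original answer; NextTimeStamp and SecondsToNext are just illustrative column names) makes this visible:

test_df.withColumn("NextTimeStamp", get_prev_time)\
    .withColumn("SecondsToNext", time_diff)\
    .orderBy("TimeStamp").show()

With these helper expressions in place, the full transformation becomes: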

test_df.withColumn("EventDuration",time_diff)\
.withColumn("EndDateTime",get_prev_time)\           #apply the helper functions
.withColumnRenamed("TimeStamp","StartDateTime")\    #rename according to your schema
.withColumn("State",when(col("Event")=="connect", "connected").otherwise("disconnected"))\ #create the state column 
.filter(col("EventDuration").isNotNull()).select("Id","StartDateTime","EndDateTime","EventDuration","State","DeviceId").show()
#finally some filtering for the last events, which do not have a previous time

Output:

+---+-------------------+-------------------+-------------+------------+--------+
| Id|      StartDateTime|        EndDateTime|EventDuration|       State|DeviceId|
+---+-------------------+-------------------+-------------+------------+--------+
|  3|2019-02-05 08:10:00|2019-02-05 08:20:00|          600|   connected|       1|
|  2|2019-02-05 08:00:05|2019-02-05 08:10:00|          595|disconnected|       1|
|  1|2019-02-05 08:00:00|2019-02-05 08:00:05|            5|   connected|       1|
+---+-------------------+-------------------+-------------+------------+--------+
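Note that EventDuration here is a plain number of seconds, while the target schema in the question shows it as a d.HH:MM:SS string. A minimal sketch for that conversion, building on the result_df computed above (format_duration is a hypothetical helper, not part of the original answer), could look like this:

def format_duration(seconds):
    # hypothetical helper: turn a duration in seconds into a "d.HH:MM:SS" string
    if seconds is None:
        return None
    days, rem = divmod(int(seconds), 86400)
    hours, rem = divmod(rem, 3600)
    minutes, secs = divmod(rem, 60)
    return "{}.{:02d}:{:02d}:{:02d}".format(days, hours, minutes, secs)

format_duration_udf = udf(format_duration, StringType())   # udf and StringType come from the wildcard imports above
result_df.withColumn("EventDuration", format_duration_udf(col("EventDuration"))).show()

For the example data, a 5-second gap would then be shown as 0.00:00:05, matching the format in the question.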