I need help with windowing: within each 1-second window I want the record from the last (1000th) millisecond. Essentially I need to aggregate the data down to the second level, keeping only the latest-millisecond row per second (using a window/rank/partition function).
Below is the code I am trying.
val result3DF = spark.sql("""
  SELECT a.*
  FROM (
    SELECT p.*,
           ROW_NUMBER() OVER (PARTITION BY xxx_ts ORDER BY stime DESC) AS RN
    FROM global_temp.CAN_Table p
  ) a
  WHERE RN = 1
""")
where xxx_ts is a timestamp of the form yyyy-MM-dd HH:mm:ss and stime is an epoch timestamp in milliseconds (e.g. 1526630451865).
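As a sketch of an alternative using the DataFrame API: instead of partitioning by the pre-formatted xxx_ts column, the 1-second bucket can be derived directly from stime by integer-dividing the millisecond epoch by 1000, which avoids any dependence on the string column's formatting. The column name sec_bucket below is hypothetical, and this assumes stime really is a millisecond epoch value.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.expressions.Window

val spark = SparkSession.builder()
  .appName("latest-row-per-second")
  .master("local[*]")
  .getOrCreate()

val df = spark.table("global_temp.CAN_Table")

// sec_bucket (hypothetical name): the 1-second window each row falls in,
// obtained by truncating the millisecond epoch down to whole seconds.
val result = df
  .withColumn("sec_bucket", (col("stime") / 1000).cast("long"))
  .withColumn("rn", row_number().over(
    Window.partitionBy(col("sec_bucket")).orderBy(col("stime").desc)))
  .filter(col("rn") === 1)       // keep only the latest millisecond per second
  .drop("rn", "sec_bucket")
```

This keeps exactly one row per second, the one with the largest stime in that second, which matches what the ROW_NUMBER query above is doing as long as xxx_ts is truncated to second granularity.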