Spark SQL - Using Spark SQL

Date: 2019-01-03 17:05:23

Tags: java scala apache-spark

I have a CSV file containing the events shown in the table below.

+-------------------+-------+
|Created            |Name   |
+-------------------+-------+
|2018-09-30 21:00:08|Event A|
|2018-09-30 21:03:11|Event C|
|2018-09-30 21:04:17|Event 3|
|2018-09-30 21:05:27|Event Y| <<<
|2018-09-30 21:06:11|Event 5|
|2018-09-30 21:07:17|Event P|
|2018-09-30 21:08:25|Event X| <<<
|2018-09-30 21:09:26|Event B|
|2018-09-30 21:10:39|Event O|
+-------------------+-------+

I need to partition the events by timestamp into 5-minute windows and search each window for an occurrence of Event X. If Event X occurs, I then need to search the same window, at any earlier time back to the start of the window, for an Event Y that precedes it.

1 Answer:

Answer 0 (score: 0)

Here is an approach that first creates 5-minute time windows, then collects the list of events per time-window partition, and finally applies a UDF to mark the wanted events:

import org.apache.spark.sql.functions._
import org.apache.spark.sql.expressions.Window
import java.sql.Timestamp

val df = Seq(
  (Timestamp.valueOf("2018-09-30 21:00:08"), "Event A"),
  (Timestamp.valueOf("2018-09-30 21:03:11"), "Event C"),
  (Timestamp.valueOf("2018-09-30 21:04:17"), "Event 3"),
  (Timestamp.valueOf("2018-09-30 21:05:27"), "Event Y"),
  (Timestamp.valueOf("2018-09-30 21:06:11"), "Event 5"),
  (Timestamp.valueOf("2018-09-30 21:07:17"), "Event P"),
  (Timestamp.valueOf("2018-09-30 21:08:25"), "Event X"),
  (Timestamp.valueOf("2018-09-30 21:09:26"), "Event B"),
  (Timestamp.valueOf("2018-09-30 21:10:39"), "Event O")
).toDF("Created", "Name")

// All rows that fall in the same 5-minute window share one partition
val winSpec = Window.partitionBy($"Win5m")

// UDF factory: marks a row when both e1 and e2 occur in the window's
// event list, e1 precedes e2, and the current row is one of the two
def checkEvents(e1: String, e2: String) = udf(
  (currEvent: String, events: Seq[String]) =>
    events.contains(e1) && events.contains(e2) &&
      events.indexOf(e1) < events.indexOf(e2) &&
      (currEvent == e1 || currEvent == e2)
)
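The ordering check inside the UDF is plain Scala and can be sanity-checked independently of Spark. A minimal sketch (the helper name `yPrecedesX` is mine, not from the answer):

```scala
// Returns true only when both events are present in the window's event
// list and e1 (e.g. "Event Y") appears before e2 (e.g. "Event X").
def yPrecedesX(e1: String, e2: String, events: Seq[String]): Boolean =
  events.contains(e1) && events.contains(e2) &&
    events.indexOf(e1) < events.indexOf(e2)

// Event list of the second 5-minute window from the example data
val window2 = Seq("Event Y", "Event 5", "Event P", "Event X", "Event B")
println(yPrecedesX("Event Y", "Event X", window2)) // true: Y precedes X
println(yPrecedesX("Event X", "Event Y", window2)) // false: order is reversed
```

Note that `indexOf` finds the first occurrence, so if an event repeats within a window, only its earliest position is compared.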

df.
  withColumn("Win5m", window($"Created", "5 minutes")).
  withColumn("Events", collect_list($"Name").over(winSpec)).
  withColumn("marked", checkEvents("Event Y", "Event X")($"Name", $"Events")).
  select($"Created", $"Name").
  where($"marked").
  show(false)
// +-------------------+-------+
// |Created            |Name   |
// +-------------------+-------+
// |2018-09-30 21:05:27|Event Y|
// |2018-09-30 21:08:25|Event X|
// +-------------------+-------+

Here is the dataset including the intermediate columns that were excluded from the final result above:

// +-------------------+-------+---------------------------------------------+---------------------------------------------+------+
// |Created            |Name   |Win5m                                        |Events                                       |marked|
// +-------------------+-------+---------------------------------------------+---------------------------------------------+------+
// |2018-09-30 21:00:08|Event A|[2018-09-30 21:00:00.0,2018-09-30 21:05:00.0]|[Event A, Event C, Event 3]                  |false |
// |2018-09-30 21:03:11|Event C|[2018-09-30 21:00:00.0,2018-09-30 21:05:00.0]|[Event A, Event C, Event 3]                  |false |
// |2018-09-30 21:04:17|Event 3|[2018-09-30 21:00:00.0,2018-09-30 21:05:00.0]|[Event A, Event C, Event 3]                  |false |
// |2018-09-30 21:05:27|Event Y|[2018-09-30 21:05:00.0,2018-09-30 21:10:00.0]|[Event Y, Event 5, Event P, Event X, Event B]|true  |
// |2018-09-30 21:06:11|Event 5|[2018-09-30 21:05:00.0,2018-09-30 21:10:00.0]|[Event Y, Event 5, Event P, Event X, Event B]|false |
// |2018-09-30 21:07:17|Event P|[2018-09-30 21:05:00.0,2018-09-30 21:10:00.0]|[Event Y, Event 5, Event P, Event X, Event B]|false |
// |2018-09-30 21:08:25|Event X|[2018-09-30 21:05:00.0,2018-09-30 21:10:00.0]|[Event Y, Event 5, Event P, Event X, Event B]|true  |
// |2018-09-30 21:09:26|Event B|[2018-09-30 21:05:00.0,2018-09-30 21:10:00.0]|[Event Y, Event 5, Event P, Event X, Event B]|false |
// |2018-09-30 21:10:39|Event O|[2018-09-30 21:10:00.0,2018-09-30 21:15:00.0]|[Event O]                                    |false |
// +-------------------+-------+---------------------------------------------+---------------------------------------------+------+
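As the `Win5m` column shows, `window($"Created", "5 minutes")` assigns each row to a tumbling window whose start is the timestamp floored to a 5-minute boundary. A sketch of that boundary arithmetic in plain Scala (the helper name `windowStart` is mine; it assumes a whole-minute timezone offset so that flooring epoch milliseconds matches flooring local time):

```scala
import java.sql.Timestamp

// Floor a timestamp to the start of its 5-minute tumbling window,
// mirroring the window starts seen in the Win5m column above.
def windowStart(ts: Timestamp, minutes: Int = 5): Timestamp = {
  val sizeMs = minutes * 60L * 1000L
  new Timestamp(ts.getTime - Math.floorMod(ts.getTime, sizeMs))
}

println(windowStart(Timestamp.valueOf("2018-09-30 21:08:25")))
// 2018-09-30 21:05:00.0 -- the window that contains both Event Y and Event X
```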