Spark Scala - 7-Day Rolling Sum

Time: 2020-04-07 18:04:08

Tags: scala apache-spark

I have some data for which I want to calculate a 7-day rolling sum. Each row for a given date should count as one occurrence. My thinking was to use something like:

val myWindow = Window.orderBy("Date").rangeBetween(Window.currentRow, days(7))
val myData = df.withColumn("Count", count($"*").over(myWindow))

But rangeBetween does not accept days(7), and I need to look 7 days ahead of the current date.

Any ideas?

Input data:

val df = Seq(
    ("08/04/2013",22),
    ("08/05/2013",24),
    ("08/06/2013",26),
    ("08/07/2013",29),
    ("08/08/2013",24),
    ("08/09/2013",24),
    ("08/10/2013",22),
    ("08/11/2013",24),
    ("08/11/2013",26)
    ).toDF("Date","Code")


+----------+----+
|      Date|Code|
+----------+----+
|08/04/2013|  22|
|08/05/2013|  24|
|08/06/2013|  26|
|08/07/2013|  29|
|08/08/2013|  24|
|08/09/2013|  24|
|08/10/2013|  22|
|08/11/2013|  24|
|08/11/2013|  26|
+----------+----+

Expected output:

+----------+----------+-----+
|     Start|       End|Count|
+----------+----------+-----+
|08/04/2013|08/10/2013|    7|
|08/05/2013|08/11/2013|    8|
+----------+----------+-----+

1 Answer:

Answer 0 (score: 0):

In Spark 2.3 you have to pass long values to rangeBetween. Since a day has 86,400 seconds, you can convert the date to epoch seconds and express the query as:

import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

// frame bounds are in seconds: current row through 7 days ahead
val myWindow = Window.orderBy("Date").rangeBetween(0, 7 * 86400)

val myData = df
  .withColumn("Date", to_date($"Date", "MM/dd/yyyy").cast("timestamp").cast("long")) // date -> epoch seconds
  .withColumn("Count", count($"*").over(myWindow))
  .withColumn("Date", $"Date".cast("timestamp").cast("date")) // back to a date column