PySpark: display date values by week, with the week's start and end dates

Posted: 2020-07-20 12:43:21

Tags: python dataframe apache-spark pyspark apache-spark-sql

I have the following code, which seems quite verbose; is there a more concise way to get the same result? What I want is the start and end date of each week, and the record count for that week.

Code, starting with creating a DataFrame:

new_list = [
  {"inv_dt":"01/01/2020","count":1},
  {"inv_dt":"02/01/2020", "count":2},
  {"inv_dt":"10/01/2020", "count":5},
  {"inv_dt":"11/01/2020","count":1},
  {"inv_dt":"12/01/2020", "count":5},
  {"inv_dt":"20/01/2020", "count":3},
  {"inv_dt":"22/01/2020", "count":2},
  {"inv_dt":"28/01/2020", "count":1}
]
from pyspark.sql import functions as F
from pyspark.sql import Row
df = spark.createDataFrame(Row(**x) for x in new_list)

Now I convert the string column to a date:

df = df.withColumn("inv_dt",F.to_date("inv_dt", "dd/MM/yyyy"))
df.show()
+----------+-----+
|    inv_dt|count|
+----------+-----+
|2020-01-01|    1|
|2020-01-02|    2|
|2020-01-10|    5|
|2020-01-11|    1|
|2020-01-12|    5|
|2020-01-20|    3|
|2020-01-22|    2|
|2020-01-28|    1|
+----------+-----+
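As a side note, Spark's `dd/MM/yyyy` pattern corresponds to `%d/%m/%Y` in Python's `strptime`, so the parsing can be sanity-checked in plain Python without a Spark session:

```python
from datetime import datetime, date

# "dd/MM/yyyy" in Spark: day first, then month, then four-digit year,
# which is "%d/%m/%Y" in strptime terms.
parsed = datetime.strptime("10/01/2020", "%d/%m/%Y").date()
print(parsed)  # 2020-01-10, matching the inv_dt column above
```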

Get the week of the year:

df = df.withColumn('week_of_year',F.weekofyear(df.inv_dt))
df.show()

+----------+-----+------------+
|    inv_dt|count|week_of_year|
+----------+-----+------------+
|2020-01-01|    1|           1|
|2020-01-02|    2|           1|
|2020-01-10|    5|           2|
|2020-01-11|    1|           2|
|2020-01-12|    5|           2|
|2020-01-20|    3|           4|
|2020-01-22|    2|           4|
|2020-01-28|    1|           5|
+----------+-----+------------+
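`weekofyear` uses ISO-8601 week numbering: weeks run Monday to Sunday, and week 1 is the week containing the year's first Thursday. That is why 2020-01-01 (a Wednesday) falls in week 1, and why week 3 (2020-01-13 to 2020-01-19) never appears: no rows land in it. Python's `isocalendar` follows the same convention, so the numbers above can be cross-checked without Spark:

```python
from datetime import date

# ISO week numbers, the same convention Spark's weekofyear uses
for d in [date(2020, 1, 1), date(2020, 1, 12), date(2020, 1, 20)]:
    print(d, "-> ISO week", d.isocalendar()[1])
# 2020-01-01 -> ISO week 1
# 2020-01-12 -> ISO week 2
# 2020-01-20 -> ISO week 4
```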

Use selectExpr to get the week's start and end, concatenate them into Week_Period, then group by it to get the count per week:

df = df.withColumn('day_of_week', F.dayofweek(F.col('inv_dt')))
df = df.selectExpr('*', 'date_sub(inv_dt, day_of_week-1) as week_start')
df = df.selectExpr('*', 'date_add(inv_dt, 7-day_of_week) as week_end')
df = df.withColumn('Week_Period', F.concat(F.col('week_start'),F.lit(' - '), F.col('week_end')))
list_of_columns = ['week_of_year','Week_Period']
df = df.groupby([F.col(x) for x in list_of_columns]).agg(F.sum(F.col('count')).alias('count'))
df.sort(df.week_of_year).show()

+------------+--------------------+-----+
|week_of_year|         Week_Period|count|
+------------+--------------------+-----+
|           1|2019-12-29 - 2020...|    3|
|           2|2020-01-05 - 2020...|    6|
|           2|2020-01-12 - 2020...|    5|
|           4|2020-01-19 - 2020...|    5|
|           5|2020-01-26 - 2020...|    1|
+------------+--------------------+-----+
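Note that week 2 appears twice: `weekofyear` counts Monday-to-Sunday ISO weeks, while `dayofweek` (1 = Sunday through 7 = Saturday) yields Sunday-to-Saturday periods, so 2020-01-12, a Sunday, starts a new Week_Period while still belonging to ISO week 2. The boundary arithmetic, `date_sub(inv_dt, day_of_week-1)` and `date_add(inv_dt, 7-day_of_week)`, can be sketched in plain Python; the `dow` remapping from `isoweekday` is my own addition, not from the post:

```python
from datetime import date, timedelta

def sunday_week_bounds(d: date):
    # Spark's dayofweek: 1 = Sunday ... 7 = Saturday.
    # Python's isoweekday: 1 = Monday ... 7 = Sunday, so remap.
    dow = d.isoweekday() % 7 + 1
    week_start = d - timedelta(days=dow - 1)  # date_sub(inv_dt, day_of_week-1)
    week_end = d + timedelta(days=7 - dow)    # date_add(inv_dt, 7-day_of_week)
    return week_start, week_end

start, end = sunday_week_bounds(date(2020, 1, 1))
print(f"{start} - {end}")  # 2019-12-29 - 2020-01-04, the first Week_Period above
```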

1 Answer:

Answer (score: 2)

This code is cleaner:

list_of_columns = ['week_of_year','Week_Period']
df\
  .withColumn("day_of_week", F.dayofweek(F.col("inv_dt")))\
  .withColumn("week_end", F.next_day(F.col("inv_dt"), 'Sat'))\
  .withColumn("week_start", F.date_add(F.col("week_end"), -6))\
  .withColumn("Week_Period", F.concat(F.col("week_start"), F.lit(" - "), F.col("week_end")))\
  .groupby([F.col(x) for x in list_of_columns]).agg(F.sum(F.col("count")).alias("count"))\
  .sort(df.week_of_year)\
  .show(truncate=False)
+------------+-----------------------+-----+
|week_of_year|Week_Period            |count|
+------------+-----------------------+-----+
|1           |2019-12-29 - 2020-01-04|3    |
|2           |2020-01-05 - 2020-01-11|5    |
|2           |2020-01-12 - 2020-01-18|6    |
|4           |2020-01-19 - 2020-01-25|5    |
|5           |2020-01-26 - 2020-02-01|1    |
+------------+-----------------------+-----+
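One caveat about this version: Spark's `next_day` returns the first matching date strictly *after* its argument, so a row that already falls on a Saturday is pushed into the following period. That is why 2020-01-11 (a Saturday, count 1) is tallied under `2020-01-12 - 2020-01-18` here, whereas the question's `dayofweek` approach counted it in the `2020-01-05` week (hence 5 vs 6 in the two outputs). A plain-Python sketch of that semantics:

```python
from datetime import date, timedelta

def next_day(d: date, target_iso: int = 6) -> date:
    """Mimic Spark's next_day: first date strictly AFTER d falling on the
    target weekday (6 = Saturday in isoweekday terms)."""
    ahead = (target_iso - d.isoweekday()) % 7
    return d + timedelta(days=ahead or 7)

print(next_day(date(2020, 1, 10)))  # 2020-01-11, a Friday rolls to the next day
print(next_day(date(2020, 1, 11)))  # 2020-01-18, a Saturday skips a full week
```

If Saturday rows should stay in their own week, one common workaround is to shift the input back a day first, e.g. `F.next_day(F.date_sub(F.col("inv_dt"), 1), 'Sat')`.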