My Spark dataframe looks like this:
+-------+----------+-----+
| Status| date |count|
+-------+----------+-----+
|Success|2019-09-06|23596|
|Failure|2019-09-06| 2494|
|Failure|2019-09-07| 1863|
|Success|2019-09-07|22399|
I am trying to compute the success/failure percentage per date and add the result to the same PySpark dataframe. After creating several intermediate tables/dataframes, I was able to compute the success or failure rate per group. How can I do this with the same single dataframe, without creating new intermediate dataframes?
Expected output:
+-------+----------+-----+---------------------------+
| Status| date     |count| Percent                   |
+-------+----------+-----+---------------------------+
|Success|2019-09-06|23596| =(23596/(23596+2494)*100) |
|Failure|2019-09-06| 2494| =(2494/(23596+2494)*100)  |
|Failure|2019-09-07| 1863| =(1863/(1863+22399)*100)  |
|Success|2019-09-07|22399| =(22399/(1863+22399)*100) |
+-------+----------+-----+---------------------------+
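As a sanity check, the formula in the first expected row can be evaluated directly (plain Python arithmetic, not part of the original question):

```python
# Success share of 2019-09-06: success count / (success + failure) * 100
percent = 23596 / (23596 + 2494) * 100
print(percent)  # ≈ 90.44
```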
Answer 0 (score: 1)
You can use a window partitioned by the "date" column, and then take the sum of the "count" column over that window:
import pyspark.sql.functions as F
from pyspark.sql.window import Window

# Partition the rows by date so the sum is computed per day
window = Window.partitionBy(['date'])
# Divide each row's count by its daily total and scale to a percentage
df = df.withColumn('Percent', F.col('count') / F.sum('count').over(window) * 100)
df.show()
+-------+-------------------+-----+-----------------+
| Status| date|count| Percent|
+-------+-------------------+-----+-----------------+
|Failure|2019-09-07 00:00:00| 1883|7.754715427065316|
|Success|2019-09-07 00:00:00|22399|92.24528457293468|
|Success|2019-09-06 00:00:00|23596|90.44078190877731|
|Failure|2019-09-06 00:00:00| 2494|9.559218091222691|
+-------+-------------------+-----+-----------------+
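For readers without a Spark session handy, the window logic above can be sketched in plain Python (the row dicts below are illustrative, built from the question's sample data): first sum the counts per date, then divide each row's count by its date total.

```python
from collections import defaultdict

rows = [
    {"Status": "Success", "date": "2019-09-06", "count": 23596},
    {"Status": "Failure", "date": "2019-09-06", "count": 2494},
    {"Status": "Failure", "date": "2019-09-07", "count": 1863},
    {"Status": "Success", "date": "2019-09-07", "count": 22399},
]

# Equivalent of F.sum('count').over(Window.partitionBy(['date'])):
totals = defaultdict(int)
for r in rows:
    totals[r["date"]] += r["count"]

# Equivalent of withColumn('Percent', count / daily total * 100):
for r in rows:
    r["Percent"] = r["count"] / totals[r["date"]] * 100
```

The two passes mirror what the window function does in one expression: the partition defines the per-date total, and the division happens row by row without creating an intermediate dataframe.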