Apache Spark - Transformation - Use row values as column headers - Pivot

Asked: 2018-03-23 05:39:10

Tags: apache-spark pivot spark-dataframe

I have a dataset like the following (id, date, price):

 - 1, 2017-01-10, 100
 - 1, 2017-01-11, 110
 - 2, 2017-01-10, 100
 - 2, 2017-01-12, 120

I need the following result:

id/date : 2017-01-10  2017-01-11  2017-01-12
1:          100         110          -
2:          100          -          120

Which transformations will produce the above output?

1 Answer:

Answer 0 (score: 2)

You can use groupBy together with pivot to get that output:

import spark.implicits._
import org.apache.spark.sql.functions.sum

//dummy data 
val df = Seq(
  (1, "2017-01-10", 100),
  (1, "2017-01-11", 110),
  (2, "2017-01-10", 100),
  (2, "2017-01-12", 120)
).toDF("id", "date", "price")

//group by id, pivot on date, and aggregate price with sum
val resultDF = df.groupBy("id").pivot("date").agg(sum("price"))
val resultDF = df.groupBy("id").pivot("date").agg(sum("price"))

resultDF.show()

Output:

+---+----------+----------+----------+
| id|2017-01-10|2017-01-11|2017-01-12|
+---+----------+----------+----------+
|  1|       100|       110|      null|
|  2|       100|      null|       120|
+---+----------+----------+----------+
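One refinement worth knowing: when pivot is called with only the column name, Spark runs an extra job first to collect the distinct values of that column. If you already know the values, you can pass them explicitly, which skips that job and fixes the column order. A minimal sketch, assuming the same df and imports as above (the value list here is taken from the sample data):

//pass the known pivot values explicitly to avoid the extra
//distinct-values job and to guarantee column order
val dates = Seq("2017-01-10", "2017-01-11", "2017-01-12")

val resultDF2 = df
  .groupBy("id")
  .pivot("date", dates)
  .agg(sum("price"))

resultDF2.show()

This produces the same table as above; any id/date combination missing from the data still comes out as null.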