Mapping multiple columns to a single key in a Spark DataFrame

Asked: 2019-07-15 16:17:34

Tags: apache-spark apache-spark-sql

I have a Spark DataFrame that looks like this:

+------+-----+-----+
|acctId|vehId|count|
+------+-----+-----+
|     1|  666|    1|
|     1|  777|    3|
|     1|  888|    2|
|     1|  999|    3|
|     2|  777|    1|
|     2|  888|    3|
|     2|  999|    1|
|     3|  777|    4|
|     3|  888|    2|
+------+-----+-----+

For each acctId, I want to map every vehId to its count and store the result back in a DataFrame, so the final result looks like this:

+------+---------------------------------------------+
|acctId| map                                         |
+------+---------------------------------------------+
|     1| Map(666 -> 1, 777 -> 3, 888 -> 2, 999 -> 3) |
|     2| Map(777 -> 1, 888 -> 3, 999 -> 1)           |
|     3| Map(777 -> 4, 888 -> 2)                     |
+------+---------------------------------------------+

What is the best way to do this?

I have tried converting the DataFrame to an RDD and mapping over the rows, but I'm not sure how to aggregate the resulting maps back to a single row per acctId. I'm fairly new to Spark and DataFrames; I've done my best to search for similar questions, so apologies if this is a very common problem.

For reference, here is how I generate the test data:

// assumes spark.implicits._ is in scope (automatic in spark-shell), needed for toDF
val testData = Seq(
    (1, 999),
    (1, 999),
    (2, 999),
    (1, 888),
    (2, 888),
    (3, 888),
    (2, 888),
    (2, 888),
    (1, 888),
    (1, 777),
    (1, 666),
    (3, 888),
    (1, 777),
    (3, 777),
    (2, 777),
    (3, 777),
    (3, 777),
    (1, 999),
    (3, 777),
    (1, 777)
).toDF("acctId", "vehId")

val grouped = testData.groupBy("acctId", "vehId").count
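As a sanity check, showing grouped should reproduce the per-(acctId, vehId) counts in the table at the top (row order may differ):

grouped.orderBy("acctId", "vehId").show()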

1 Answer:

Answer 0 (score: 2):

I think you have to use a double groupBy, as shown below:

val testData = Seq(
  (1, 999),
  (1, 999),
  (2, 999),
  (1, 888),
  (2, 888),
  (3, 888),
  (2, 888),
  (2, 888),
  (1, 888),
  (1, 777),
  (1, 666),
  (3, 888),
  (1, 777),
  (3, 777),
  (2, 777),
  (3, 777),
  (3, 777),
  (1, 999),
  (3, 777),
  (1, 777)
).toDF("acctId", "vehId")

import org.apache.spark.sql.Row
import org.apache.spark.sql.functions._

// UDF to convert the collected list of (vehId, count) structs into a Map[Int, Long]
val listToMap = udf((input: Seq[Row]) => input.map(row => (row.getAs[Int](0), row.getAs[Long](1))).toMap)

val resultDF = testData.groupBy("acctId", "vehId")
  .agg(count("acctId").cast("long").as("count"))
  .groupBy("acctId")
  .agg(collect_list(struct("vehId", "count")) as ("map"))
  .withColumn("map", listToMap($"map"))

Output:

resultDF.show(false)
+------+----------------------------------------+
|acctId|map                                     |
+------+----------------------------------------+
|1     |[777 -> 3, 666 -> 1, 999 -> 3, 888 -> 2]|
|3     |[777 -> 4, 888 -> 2]                    |
|2     |[777 -> 1, 999 -> 1, 888 -> 3]          |
+------+----------------------------------------+
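As a side note, on Spark 2.4 or later the built-in map_from_entries function can build the map column directly from the collected structs, so the UDF above isn't needed. A minimal sketch under that assumption, reusing the same testData:

import org.apache.spark.sql.functions._

// Spark 2.4+ only: map_from_entries turns an array of (key, value) structs
// into a MapType column, replacing the listToMap UDF above
val resultDF2 = testData
  .groupBy("acctId", "vehId")
  .agg(count("acctId").as("count"))
  .groupBy("acctId")
  .agg(map_from_entries(collect_list(struct($"vehId", $"count"))).as("map"))

resultDF2.show(false)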