Collect the values of a Spark DataFrame column into a list per group

Asked: 2019-07-16 10:55:06

Tags: scala apache-spark

I have a dataset that looks like this:

+-------+-----+----------+--------------+
| Name  | Age | Pet Name | Phone Number |
+-------+-----+----------+--------------+
| Brett |  14 | Rover    | 123 456 7889 |
| Amy   |  15 | Ginger   | 123 456 8888 |
| Amy   |  15 | Polly    | 123 456 8888 |
| Josh  |  14 | Fido     | 312 456 9999 |
+-------+-----+----------+--------------+

I need to use Spark to display it in the following format:

+-------+-----+---------------+--------------+
| Name  | Age |   Pet Name    | Phone Number |
+-------+-----+---------------+--------------+
| Brett |  14 | Rover         | 123 456 7889 |
| Amy   |  15 | Ginger, Polly | 123 456 8888 |
| Josh  |  14 | Fido          | 312 456 9999 |
+-------+-----+---------------+--------------+

Can someone suggest the best way to approach this?

1 Answer:

Answer 0 (score: 3)

You can groupBy "Name" and "Age", collect "Pet Name" as a list, and take the first "Phone Number" in each group:

import org.apache.spark.sql.functions.{collect_list, concat_ws, first}
import spark.implicits._  // for the $"colName" syntax; assumes a SparkSession named spark

df.groupBy("Name", "Age")
  .agg(collect_list($"Pet Name").as("PetName"), first("Phone Number").as("PhoneNumber"))

Or, since the phone number is the same within each group, you can include it in the groupBy instead of using first:

data.groupBy("Name", "Age", "Phone Number")
  .agg(collect_list($"Pet Name").as("PetName"))

Output:

+-----+---+---------------+------------+
|Name |Age|PetName        |PhoneNumber |
+-----+---+---------------+------------+
|Amy  |15 |[Ginger, Polly]|123 456 8888|
|Brett|14 |[Rover]        |123 456 7889|
|Josh |14 |[Fido]         |312 456 9999|
+-----+---+---------------+------------+

If you need a string rather than an array, you can apply concat_ws to the collected list:

data.groupBy("Name", "Age", "Phone Number")
  .agg(concat_ws(",",collect_list($"Pet Name")).as("PetName"))

Output:

+-----+---+------------+------------+
|Name |Age|Phone Number|PetName     |
+-----+---+------------+------------+
|Brett|14 |123 456 7889|Rover       |
|Amy  |15 |123 456 8888|Ginger,Polly|
|Josh |14 |312 456 9999|Fido        |
+-----+---+------------+------------+
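If it helps to see the grouping logic without a Spark session, the same transformation can be sketched with plain Scala collections. The `Row` case class and sample data below are illustrative (they mirror the question's table, not any real API): group by every column except the pet name, then join the pet names, which plays the role of `concat_ws(",", collect_list(...))`:

```scala
// Hypothetical row type mirroring the question's table.
case class Row(name: String, age: Int, pet: String, phone: String)

val rows = List(
  Row("Brett", 14, "Rover",  "123 456 7889"),
  Row("Amy",   15, "Ginger", "123 456 8888"),
  Row("Amy",   15, "Polly",  "123 456 8888"),
  Row("Josh",  14, "Fido",   "312 456 9999")
)

val merged = rows
  // like groupBy("Name", "Age", "Phone Number")
  .groupBy(r => (r.name, r.age, r.phone))
  // like concat_ws(",", collect_list($"Pet Name"))
  .map { case ((name, age, phone), group) =>
    (name, age, group.map(_.pet).mkString(", "), phone)
  }
  .toList
```

`groupBy` on a Scala `List` keeps the elements of each group in their original order, so the joined pet names come out as "Ginger, Polly", matching what `collect_list` typically produces for the same input order.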