Take elements from different array-type columns and build a column with heterogeneous data in Spark

Asked: 2017-10-26 15:34:07

Tags: scala apache-spark apache-spark-sql

I have a Spark dataframe parsed from an XML file, containing data in the following format:

+---------+------------------------------------------------------------------------------------------+----------------------------+------------------------------------------------+
|id       |a                                                                                         |b                           |c                                               |
+---------+------------------------------------------------------------------------------------------+----------------------------+------------------------------------------------+
|191683250|[52396062, 55064266, 51149167, 53441347, 51309543, 51517728, 51543627, 68138995, 70180065]|[2, 2, 1, 3, 3, 2, 2, 27, 1]|[1.15, 0.8, 4.0, 2.49, 1.0, 2.8, 0.4, 0.49, 2.0]|
+---------+------------------------------------------------------------------------------------------+----------------------------+------------------------------------------------+

I need the output data in the following format:
+---------+---------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|id       |a                                                                                                                                                                          |
+---------+---------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|191683250|Array[(52396062,2,1.15), (55064266,2,0.8), (51149167,1,4.0), (53441347,3,2.49), (51309543,3,1.0), (51517728,2,2.8), (51543627,2,0.4), (68138995,27,0.49), (70180065,1,2.0)]|
+---------+---------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

That is, I need an array of StructTypes / tuples. I am stuck on how to proceed with this.

Please point me to how to achieve this in Spark using Scala. Any help is appreciated.
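
For reference, a minimal sketch (an assumption for illustration, not the actual XML parsing code) that reproduces the example dataframe in a spark-shell session:

import spark.implicits._

// Stand-in for the dataframe parsed from XML
val df = Seq(
  (191683250L,
    Seq(52396062, 55064266, 51149167, 53441347, 51309543, 51517728, 51543627, 68138995, 70180065),
    Seq(2, 2, 1, 3, 3, 2, 2, 27, 1),
    Seq(1.15, 0.8, 4.0, 2.49, 1.0, 2.8, 0.4, 0.49, 2.0))
).toDF("id", "a", "b", "c")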

2 Answers:

Answer 0 (score: 4)

In Spark >= 2.4, this can be solved with the arrays_zip function:

import org.apache.spark.sql.functions.arrays_zip

val df = // Example dataframe in question
val df2 = df.withColumn("a", arrays_zip($"a", $"b", $"c"))
  .drop("b", "c")

For older versions of Spark, use a UDF:

import org.apache.spark.sql.functions.udf

// Zip the three arrays element-wise into an array of (Int, Int, Double) tuples
val convertToArray = udf((a: Seq[Int], b: Seq[Int], c: Seq[Double]) => {
  a zip b zip c map { case ((a, b), c) => (a, b, c) }
})

val df = // Example dataframe in question
val df2 = df.withColumn("a", convertToArray($"a", $"b", $"c"))
  .drop("b", "c")

Resulting dataframe:

+---------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|id       |a                                                                                                                                                                     |
+---------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|191683250|[[52396062,2,1.15], [55064266,2,0.8], [51149167,1,4.0], [53441347,3,2.49], [51309543,3,1.0], [51517728,2,2.8], [51543627,2,0.4], [68138995,27,0.49], [70180065,1,2.0]]|
+---------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------+
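
If the individual values are needed again later, one possibility (just a sketch, assuming the UDF variant above, where the tuple fields get the default names _1, _2 and _3) is to explode the zipped array back into rows:

import org.apache.spark.sql.functions.explode

df2.select($"id", explode($"a").as("t"))
  .select($"id", $"t._1".as("a"), $"t._2".as("b"), $"t._3".as("c"))
  .show(false)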

Answer 1 (score: 1)

This answer is not as clean as @Shaido's answer; it just shows another way of doing it.

import org.apache.spark.sql.functions.{array, struct}

df.select($"id",
  array(struct($"a"(0), $"b"(0), $"c"(0)),
    struct($"a"(1), $"b"(1), $"c"(1)),
    struct($"a"(2), $"b"(2), $"c"(2)),
    struct($"a"(3), $"b"(3), $"c"(3)),
    struct($"a"(4), $"b"(4), $"c"(4)),
    struct($"a"(5), $"b"(5), $"c"(5)),
    struct($"a"(6), $"b"(6), $"c"(6)),
    struct($"a"(7), $"b"(7), $"c"(7)),
    struct($"a"(8), $"b"(8), $"c"(8))).as("a"))
  .show(false)

You should get the same result as in the first answer:

+---------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|id       |a                                                                                                                                                                     |
+---------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|191683250|[[52396062,2,1.15], [55064266,2,0.8], [51149167,1,4.0], [53441347,3,2.49], [51309543,3,1.0], [51517728,2,2.8], [51543627,2,0.4], [68138995,27,0.49], [70180065,1,2.0]]|
+---------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------+
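
The hard-coded indices can also be generated programmatically if the array length is known up front; a sketch assuming all three arrays always hold exactly 9 elements:

import org.apache.spark.sql.functions.{array, struct}

val n = 9  // assumed fixed length of the arrays
val zipped = array((0 until n).map(i => struct($"a"(i), $"b"(i), $"c"(i))): _*)

df.select($"id", zipped.as("a")).show(false)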