How to convert an Array[Struct[String, String]] column type to Array[Map[String, String]] in Hive?

Asked: 2016-04-26 10:53:44

Tags: scala apache-spark hive apache-spark-sql spark-dataframe

I have a column in a Hive table:

Column name: filters

Data type:

 |-- filters: array (nullable = true)
 |    |-- element: struct (containsNull = true)
 |    |    |-- name: string (nullable = true)
 |    |    |-- value: string (nullable = true)

I want to get the value from this column by its corresponding name.

What I have done so far:

val sdf: DataFrame = sqlContext.sql("select * from <tablename> where id='12345'")

val sdfFilters = sdf.select("filters").rdd.map(r => r(0).asInstanceOf[Seq[(String,String)]]).collect()

Output: sdfFilters: Array[Seq[(String, String)]] = Array(WrappedArray([filter_RISKFACTOR,OIS.SPD.*], [filter_AGGCODE,IR]), WrappedArray([filter_AGGCODE,IR_]))

Note: I cast to Seq because a direct WrappedArray-to-Map conversion is not possible.
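As an aside, the asInstanceOf cast above is fragile: Spark returns each struct element as a Row, not as a Scala tuple, so casting to Seq[(String, String)] will typically fail at runtime. A minimal sketch of a safer extraction follows; the Spark calls are shown as comments because they assume the question's sqlContext setup:

```scala
// Pure core of the conversion: a sequence of (name, value) pairs,
// read out of the struct fields, collapses into a Map.
// Note: on a duplicate name, only the last value is kept.
def toFilterMap(pairs: Seq[(String, String)]): Map[String, String] =
  pairs.toMap

// In Spark (assumption: the 1.x-style sqlContext from the question),
// each struct element arrives as a Row, so read its fields with getAs
// before converting:
//
//   import org.apache.spark.sql.Row
//   val maps = sdf.select("filters").rdd.map { r =>
//     toFilterMap(r.getAs[Seq[Row]](0).map(f =>
//       f.getAs[String]("name") -> f.getAs[String]("value")))
//   }.collect()
```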

What should I do next?

1 Answer:

Answer 0 (score: 1)

I want to get the value from this column by its corresponding name.

If you want a simple and reliable way to get all values by name, you can flatten your table using explode and a filter:

case class Data(name: String, value: String)
case class Filters(filters: Array[Data])

val df = sqlContext.createDataFrame(Seq(Filters(Array(Data("a", "b"), Data("a", "c"))), Filters(Array(Data("b", "c")))))
df.show()
+--------------+
|       filters|
+--------------+
|[[a,b], [a,c]]|
|       [[b,c]]|
+--------------+

df.withColumn("filter", explode($"filters"))
  .select($"filter.name" as "name", $"filter.value" as "value")
  .where($"name" === "a")
  .show()
+----+-----+
|name|value|
+----+-----+
|   a|    b|
|   a|    c|
+----+-----+

You can also collect the data your way:

val flatDf = df.withColumn("filter", explode($"filters")).select($"filter.name" as "name", $"filter.value" as "value")
flatDf.rdd.map(r => Array(r(0), r(1))).collect()
res0: Array[Array[Any]] = Array(Array(a, b), Array(a, c), Array(b, c))
flatDf.rdd.map(r => r(0) -> r(1)).groupByKey().collect() // not the best idea if you have many values per key
res1: Array[(Any, Iterable[Any])] = Array((b,CompactBuffer(c)), (a,CompactBuffer(b, c)))

If you want to convert the array[struct] to map[string, string] in order to save it to some storage later, that is a different story, and that case is better solved with a UDF. In any case, avoid collect() whenever possible so that your code stays scalable.
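A sketch of that UDF route (my example, not from the answer; the Spark wiring is shown in comments because it assumes a live sqlContext and the question's schema): the core is again a struct-list-to-map conversion, applied per row on the executors.

```scala
// Core logic the UDF would wrap: (name, value) pairs -> Map.
// On duplicate names the last value wins.
def filtersToMap(entries: Seq[(String, String)]): Map[String, String] =
  entries.toMap

// Registering it as a Spark UDF (assumption: Spark 1.6-style API;
// struct elements reach the UDF as Rows, so unpack the two string
// fields by position):
//
//   import org.apache.spark.sql.Row
//   import org.apache.spark.sql.functions.udf
//   val toMap = udf { structs: Seq[Row] =>
//     filtersToMap(structs.map(s => s.getString(0) -> s.getString(1)))
//   }
//   df.withColumn("filterMap", toMap($"filters"))
```

This keeps the conversion distributed across the executors rather than collecting rows to the driver, in line with the advice about collect() above.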