Converting a StructType to MapType in Spark.
Schema:

event: struct (nullable = true)
 |-- event_category: string (nullable = true)
 |-- event_name: string (nullable = true)
 |-- properties: struct (nullable = true)
 |    |-- prop1: string (nullable = true)
 |    |-- prop2: string (nullable = true)
Sample data:

{ "event": {
    "event_category": "abc",
    "event_name": "click",
    "properties": {
      "prop1": "prop1Value",
      "prop2": "prop2Value",
      ....
    }
  }
}
Desired output:

event_category | event_name | properties_key | properties_value
abc            | click      | prop1          | prop1Value
abc            | click      | prop2          | prop2Value
Answer (score: 0):
You have to find some mechanism to build a map out of the properties struct. Here a udf function zips the property names (keys) with their values and returns an array of (key, value) tuples.
import org.apache.spark.sql.functions._

// Zip each property name with the value at the same position,
// yielding an array of (key, value) tuples.
def collectUdf = udf((cols: Seq[String], values: Seq[String]) => cols.zip(values))
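The zipping inside the udf is plain Scala collection logic: element i of the column-name array is paired with element i of the value array. A minimal standalone sketch of that step (no Spark required; the sample values are taken from the data above):

```scala
// Pair each property name with the value at the same position,
// exactly what the udf body does with the arrays Spark passes in.
val keys   = Seq("prop1", "prop2")
val values = Seq("prop1Value", "prop2Value")
val pairs  = keys.zip(values)
println(pairs) // List((prop1,prop1Value), (prop2,prop2Value))
```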
Spark does not support multiple generators in one select, so you have to save the exploded result into a temporary dataframe.
val columnsMap = df_json.select($"event.properties.*").columns  // the property names, in column order
val temp = df_json.withColumn("event_properties", explode(collectUdf(lit(columnsMap), array($"event.properties.*"))))
The last step is to split the event_properties column into the key and value columns:
temp.select($"event.event_category", $"event.event_name", $"event_properties._1".as("properties_key"), $"event_properties._2".as("properties_value")).show(false)
You should get the result you want:
+--------------+----------+--------------+----------------+
|event_category|event_name|properties_key|properties_value|
+--------------+----------+--------------+----------------+
|abc |click |prop1 |prop1Value |
|abc |click |prop2 |prop2Value |
+--------------+----------+--------------+----------------+
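As an aside, on Spark 2.4 or later the udf can be avoided entirely: the built-in `map_from_arrays` function builds a `map<string,string>` from the property names and values, and exploding a map column yields `key` and `value` columns directly. A sketch under that assumption (the local session and inline sample record are only there to make it self-contained):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Local session and inline sample data, just to make the sketch runnable.
val spark = SparkSession.builder().master("local[1]").appName("struct-to-map").getOrCreate()
import spark.implicits._

val df_json = spark.read.json(Seq(
  """{"event":{"event_category":"abc","event_name":"click","properties":{"prop1":"prop1Value","prop2":"prop2Value"}}}"""
).toDS)

// Spark 2.4+: build the map directly from the column names and values, no udf.
val keys = df_json.select($"event.properties.*").columns.map(lit(_))

val result = df_json.select(
  $"event.event_category",
  $"event.event_name",
  // exploding a map column produces two columns named "key" and "value"
  explode(map_from_arrays(array(keys: _*), array($"event.properties.*")))
).withColumnRenamed("key", "properties_key")
 .withColumnRenamed("value", "properties_value")

result.show(false)
```

This also sidesteps the temporary-dataframe step, since the single explode fits in one select.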