Expanding a StructType to a MapType in Spark

Date: 2018-01-18 22:22:40

Tags: apache-spark apache-spark-sql

I need to convert a StructType column into a MapType in Spark.

Schema:

event: struct (nullable = true)
|    | event_category: string (nullable = true)
|    | event_name: string (nullable = true)
|    | properties: struct (nullable = true)
|    |    | prop1: string (nullable = true)
|    |    | prop2: string (nullable = true)

Sample data:

{ "event": {
     "event_category: "abc",
      "event_name": "click",
      "properties" : {
          "prop1": "prop1Value",
          "prop2": "prop2Value",
          ....
      }
   } 
}

The output I need is:

event_category | event_name | properties_key | properties_value
abc            | click      | prop1          | prop1Value
abc            | click      | prop2          | prop2Value

1 Answer:

Answer 0 (score: 0)
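
For context, here is a minimal sketch of how the sample JSON above could be loaded into the df_json DataFrame used below (the one-line JSON string, the SparkSession setup, and the local master are my own assumptions, not part of the original answer):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("struct-to-map").getOrCreate()
import spark.implicits._

// the sample event as a single-line JSON string (assumed; any source with the same schema works)
val sample = Seq("""{"event":{"event_category":"abc","event_name":"click","properties":{"prop1":"prop1Value","prop2":"prop2Value"}}}""")
val df_json = spark.read.json(sample.toDS)
df_json.printSchema()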

You have to find some mechanism to create a map out of the properties struct. I've used a udf function to zip the property column names with their values and return them as an array of tuples:

import org.apache.spark.sql.functions._

// zips the property column names with their corresponding values into (name, value) tuples
def collectUdf = udf((cols: collection.mutable.WrappedArray[String], values: collection.mutable.WrappedArray[String]) => cols.zip(values))

Spark doesn't support multiple generators in a single select, so you have to save the result to a temporary dataframe first:

// collect the names of the property columns
val columnsMap = df_json.select($"event.properties.*").columns
// zip names with values and explode into one row per property
val temp = df_json.withColumn("event_properties", explode(collectUdf(lit(columnsMap), array($"event.properties.*"))))
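
At this point each row of temp carries one (name, value) pair in event_properties. Because the UDF returns Scala tuples, Spark encodes them as structs with fields _1 and _2, which is why the final select addresses them that way; a quick sanity check (a sketch, assuming the setup above):

// event_properties should show up as struct<_1: string, _2: string>
temp.select($"event_properties").printSchema()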

The final step is to split event_properties into separate key and value columns:

temp.select($"event.event_category", $"event.event_name", $"event_properties._1".as("properties_key"), $"event_properties._2".as("properties_value")).show(false)

And you should have what you wanted:

+--------------+----------+--------------+----------------+
|event_category|event_name|properties_key|properties_value|
+--------------+----------+--------------+----------------+
|abc           |click     |prop1         |prop1Value      |
|abc           |click     |prop2         |prop2Value      |
+--------------+----------+--------------+----------------+
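
Since the question asks for a MapType specifically, a possible alternative (my own sketch, not part of the original answer; it assumes Spark 2.4+ for map_from_arrays) is to build a real map column and explode it directly, avoiding the UDF:

import org.apache.spark.sql.functions._

val propCols = df_json.select($"event.properties.*").columns

// build a MapType column: keys are the property names, values are the matching struct fields
val withMap = df_json.withColumn(
  "properties_map",
  map_from_arrays(
    array(propCols.map(lit(_)): _*),
    array(propCols.map(c => col(s"event.properties.$c")): _*)
  )
)

// exploding a map yields one row per entry, with key/value columns that can be aliased
withMap.select(
  $"event.event_category",
  $"event.event_name",
  explode($"properties_map").as(Seq("properties_key", "properties_value"))
).show(false)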