Exploding a map column in PySpark without losing null values

Asked: 2018-02-07 14:17:01

Tags: apache-spark pyspark spark-dataframe explode

Is there any elegant way to explode a map column in PySpark 2.2 without losing null values? explode_outer was only introduced in PySpark 2.3.

The schema of the affected column is:

 |-- foo: map (nullable = true)
 |    |-- key: string
 |    |-- value: struct (valueContainsNull = true)
 |    |    |-- first: long (nullable = true)
 |    |    |-- last: long (nullable = true)

I would like to replace empty maps with some dummy value so that I can explode the whole DataFrame without losing the null values. I tried something like this, but I get an error:

from pyspark.sql.functions import when, size, col
df = spark.read.parquet("path").select(
        when(size(col("foo")) == 0, {"key": [0, 0]}).alias("bar")
    )

The error:

Py4JJavaError: An error occurred while calling z:org.apache.spark.sql.functions.when.
: java.lang.RuntimeException: Unsupported literal type class java.util.HashMap {key=[0, 0]}
    at org.apache.spark.sql.catalyst.expressions.Literal$.apply(literals.scala:77)
    at org.apache.spark.sql.catalyst.expressions.Literal$$anonfun$create$2.apply(literals.scala:163)
    at org.apache.spark.sql.catalyst.expressions.Literal$$anonfun$create$2.apply(literals.scala:163)
    at scala.util.Try.getOrElse(Try.scala:79)
    at org.apache.spark.sql.catalyst.expressions.Literal$.create(literals.scala:162)
    at org.apache.spark.sql.functions$.typedLit(functions.scala:112)
    at org.apache.spark.sql.functions$.lit(functions.scala:95)
    at org.apache.spark.sql.functions$.when(functions.scala:1256)
    at org.apache.spark.sql.functions.when(functions.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:280)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:748)
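
The error happens because when() routes its value argument through lit(), and lit() cannot turn a plain Python dict into a Spark literal (hence "Unsupported literal type class java.util.HashMap"). One way around it is to build the dummy map from Column expressions with create_map and struct instead; a rough sketch of that idea, not the accepted answer below (note that in Spark 2.x size() returns -1 for a null map, hence the extra isNull check):

from pyspark.sql.functions import when, size, col, lit, struct, create_map

# dummy map built from Column expressions instead of a Python dict;
# the struct fields are cast to long to match the original value type
dummy_map = create_map(
    lit("key"),
    struct(lit(0).cast("long").alias("first"), lit(0).cast("long").alias("last"))
)

df = spark.read.parquet("path").select(
    when(col("foo").isNull() | (size(col("foo")) == 0), dummy_map)
    .otherwise(col("foo"))
    .alias("bar")
)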

1 Answer:

Answer 0 (score: 1)

So I finally got it working. I replaced the empty maps with a dummy value, then used explode and dropped the original column.

from pyspark.sql.functions import udf, explode
from pyspark.sql.types import MapType, StringType, StructType, StructField, LongType

# replace a null or empty map with a dummy entry so explode does not drop the row
replace_empty_map = udf(
    lambda x: {"key": [0, 1]} if x is None or len(x) == 0 else x,
    MapType(StringType(),
            StructType([StructField("first", LongType()), StructField("last", LongType())]))
)

df = df.withColumn("foo_replaced", replace_empty_map(df["foo"])).drop("foo")
df = df.select("*", explode("foo_replaced").alias("foo_key", "foo_val")).drop("foo_replaced")
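
For what it's worth, on Spark 2.3+ the explode_outer mentioned in the question makes the dummy value unnecessary; a minimal sketch:

from pyspark.sql.functions import explode_outer

# explode_outer keeps rows whose map is null or empty,
# emitting null foo_key/foo_val instead of dropping the row
df = df.select("*", explode_outer("foo").alias("foo_key", "foo_val")).drop("foo")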