When I use the posexplode() function in Spark SQL, the statement below produces "pos" and "col" as the default column names:
scala> spark.sql(""" with t1(select to_date('2019-01-01') first_day) select first_day,date_sub(add_months(first_day,1),1) last_day, posexplode(array(5,6,7)) from t1 """).show(false)
+----------+----------+---+---+
|first_day |last_day |pos|col|
+----------+----------+---+---+
|2019-01-01|2019-01-31|0 |5 |
|2019-01-01|2019-01-31|1 |6 |
|2019-01-01|2019-01-31|2 |7 |
+----------+----------+---+---+
What is the syntax to override these default names in spark.sql?
With the DataFrame API it can be done by passing a Seq of aliases to as(), e.g. posexplode(lit(arr)).as(Seq("arr_val","arr_pos")):
scala> val arr= Array(5,6,7)
arr: Array[Int] = Array(5, 6, 7)
scala> Seq(("dummy")).toDF("x").select(posexplode(lit(arr)).as(Seq("arr_val","arr_pos"))).show(false)
+-------+-------+
|arr_val|arr_pos|
+-------+-------+
|0 |5 |
|1 |6 |
|2 |7 |
+-------+-------+
How do I get the same thing in SQL? I tried
spark.sql(""" with t1(select to_date('2011-01-01') first_day) select first_day,date_sub(add_months(first_day,1),1) last_day, posexplode(array(5,6,7)) as(Seq('p','c')) from t1 """).show(false)
and
spark.sql(""" with t1(select to_date('2011-01-01') first_day) select first_day,date_sub(add_months(first_day,1),1) last_day, posexplode(array(5,6,7)) as(('p','c')) from t1 """).show(false)
but both throw errors.
Answer (score: 1):
You can use LATERAL VIEW:
spark.sql("""
WITH t1 AS (SELECT to_date('2011-01-01') first_day)
SELECT first_day, date_sub(add_months(first_day,1),1) last_day, p, c
FROM t1
LATERAL VIEW posexplode(array(5,6,7)) AS p, c
""").show
+----------+----------+---+---+
| first_day| last_day| p| c|
+----------+----------+---+---+
|2011-01-01|2011-01-31| 0| 5|
|2011-01-01|2011-01-31| 1| 6|
|2011-01-01|2011-01-31| 2| 7|
+----------+----------+---+---+
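If you also want to name the generated view itself (not only its columns), LATERAL VIEW accepts a table alias before AS. A minimal sketch of that variant, where the alias exploded is purely illustrative:
// same query, but the lateral view gets its own alias so the
// generated columns can be referenced as exploded.p / exploded.c
spark.sql("""
  WITH t1 AS (SELECT to_date('2011-01-01') first_day)
  SELECT first_day, date_sub(add_months(first_day,1),1) last_day, exploded.p, exploded.c
  FROM t1
  LATERAL VIEW posexplode(array(5,6,7)) exploded AS p, c
""").show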
or alias the tuple:
spark.sql("""
WITH t1 AS (SELECT to_date('2011-01-01') first_day)
SELECT first_day, date_sub(add_months(first_day,1),1) last_day,
posexplode(array(5,6,7)) AS (p, c)
FROM t1
""").show
+----------+----------+---+---+
| first_day| last_day| p| c|
+----------+----------+---+---+
|2011-01-01|2011-01-31| 0| 5|
|2011-01-01|2011-01-31| 1| 6|
|2011-01-01|2011-01-31| 2| 7|
+----------+----------+---+---+
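The same aliased generator expression also works from the DataFrame side through selectExpr, since it accepts the identical SQL fragment. A minimal sketch, assuming a spark-shell session (so spark.implicits._ is already in scope for toDF):
import org.apache.spark.sql.functions._

// rebuild the single-row input used above
val t1 = Seq("2011-01-01").toDF("d").select(to_date(col("d")).as("first_day"))

// selectExpr parses each string as a SQL expression, including the AS (p, c) multi-alias
t1.selectExpr(
  "first_day",
  "date_sub(add_months(first_day, 1), 1) AS last_day",
  "posexplode(array(5, 6, 7)) AS (p, c)"
).show(false)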
Tested with Spark 2.4.0.
Note that the aliases are not strings and should not be wrapped in ' or " quotes. If you have to use non-standard identifiers, you should use backticks, i.e.
WITH t1 AS (SELECT to_date('2011-01-01') first_day)
SELECT first_day, date_sub(add_months(first_day,1),1) last_day,
posexplode(array(5,6,7)) AS (`arr pos`, `arr_value`)
FROM t1
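As a side note (my own sketch, not part of the original answer): a backtick-quoted alias such as `arr pos` has to be quoted the same way wherever it is referenced later, for example when selecting from a subquery:
spark.sql("""
  WITH t1 AS (SELECT to_date('2011-01-01') first_day)
  -- the non-standard identifier keeps its backticks when referenced
  SELECT first_day, `arr pos`, arr_value
  FROM (
    SELECT first_day,
           posexplode(array(5,6,7)) AS (`arr pos`, arr_value)
    FROM t1
  ) t
""").show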