Recursive CTE in Spark SQL

Date: 2018-09-28 20:59:16

Tags: apache-spark apache-spark-sql databricks spark-notebook

; WITH Hierarchy AS
(
    SELECT DISTINCT PersonnelNumber, Email, ManagerEmail
    FROM dimstage
    UNION ALL
    SELECT e.PersonnelNumber, e.Email, e.ManagerEmail
    FROM dimstage e
    JOIN Hierarchy h ON e.Email = h.ManagerEmail
)
SELECT * FROM Hierarchy

Can you achieve the same thing in Spark SQL?

2 Answers:

Answer 0 (score: 0)

This is not possible with Spark SQL. The WITH clause exists, but there is no CONNECT BY and no recursion as in, for example, Oracle or DB2.
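For reference, a plain non-recursive WITH does work. A minimal sketch, assuming the dimstage table from the question is registered as a table or temp view and spark is the session object:

spark.sql("""
    WITH Hierarchy AS (
        SELECT DISTINCT PersonnelNumber, Email, ManagerEmail
        FROM dimstage
    )
    SELECT * FROM Hierarchy
""").show()

Referencing Hierarchy inside its own definition (or using WITH RECURSIVE) raises a parse/analysis error at the time of writing, so the recursion has to be unrolled manually on the driver side, as the next answer does.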

Answer 1 (score: 0)

This is quite late, but today I tried to implement a recursive CTE query using PySpark SQL.

Here I have a simple dataframe. What I want to do is find the latest ID for each OldID by following the replacement chain: for example, 1 → 2 → 3 → 4 → 5, so OldID 1 should resolve to 5.

Original dataframe:

+-----+-----+
|OldID|NewID|
+-----+-----+
|    1|    2|
|    2|    3|
|    3|    4|
|    4|    5|
|    6|    7|
|    7|    8|
|    9|   10|
+-----+-----+

Desired result:

+-----+-----+
|OldID|NewID|
+-----+-----+
|    1|    5|
|    2|    5|
|    3|    5|
|    4|    5|
|    6|    8|
|    7|    8|
|    9|   10|
+-----+-----+

Here is my code:

# broadcast() needs to be imported, and checkpoint() requires a checkpoint
# directory to have been set (see the setup sketch below the code).
from pyspark.sql.functions import broadcast

df = sqlContext.createDataFrame([(1, 2), (2, 3), (3, 4), (4, 5), (6, 7), (7, 8), (9, 10)], "OldID integer, NewID integer").checkpoint().cache()

dfcheck = df.drop('NewID')
dfdistinctID = df.select('NewID').distinct()
dfidfinal = dfdistinctID.join(dfcheck, [dfcheck.OldID == dfdistinctID.NewID], how="left_anti")  # IDs that were never replaced, i.e. the final ID of each chain

dfcurrent = df.join(dfidfinal, [dfidfinal.NewID == df.NewID], how="left_semi").checkpoint().cache()  # rows whose NewID is already final; these seed the backwards walk
dfresult = dfcurrent
dfdifferentalias = df.select(df.OldID.alias('id1'), df.NewID.alias('id2')).checkpoint().cache()

# Each pass follows every chain one step backwards while keeping the final
# NewID, until no row has a predecessor left.
while dfcurrent.count() > 0:
    dfcurrent = dfcurrent.join(broadcast(dfdifferentalias), [dfcurrent.OldID == dfdifferentalias.id2], how="inner").select(dfdifferentalias.id1.alias('OldID'), dfcurrent.NewID.alias('NewID')).cache()
    dfresult = dfresult.unionAll(dfcurrent)

display(dfresult.orderBy('OldID'))

[Databricks notebook screenshot]
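To run this outside a Databricks notebook, something like the following setup is assumed (a sketch: the checkpoint path is arbitrary, and display() is Databricks-specific, so show() stands in for it):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("transitive-closure").getOrCreate()
spark.sparkContext.setCheckpointDir("/tmp/checkpoints")  # checkpoint() fails without this
sqlContext = spark  # SparkSession.createDataFrame accepts the same arguments

# ... run the code above, then:
dfresult.orderBy('OldID').show()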

I know the performance is poor, but at least it gives the answer I need.

This is my first time posting an answer on Stack Overflow, so forgive me if I have made any mistakes.