SQL query error in PySpark when using temporary tables

Posted: 2018-09-18 10:18:56

Tags: sql sql-server apache-spark-sql pyspark-sql

I have a SQL query that I need to run from PySpark (Databricks). Because the query uses a WITH (CTE) clause, PySpark cannot read it through the JDBC reader. Could someone review my query and help me rewrite it as a single SELECT statement, without the WITH clause?

promotions = """
(WITH VCTE_Promotions as 
(SELECT v.Shortname, 
v.Employee_ID_ALT, v.Job_Level, 
v.Management_Level, 
CAST(sysdatetime() AS date) AS PIT_Date, 
v.Employee_Status_Alt as Employee_Status, 
v.Work_Location_Region, v.Work_Location_Country_Desc, v.HML, 
dbo.T_Mngmt_Level_IsManager_Mapping.IsManager
FROM Worker_CUR as v LEFT OUTER JOIN
dbo.T_Mngmt_Level_IsManager_Mapping ON v.Management_Level = dbo.T_Mngmt_Level_IsManager_Mapping.Management_Level),
VCTE_Promotion_v2_Eval as (
SELECT        Employee_ID_ALT,
                             (SELECT MAX(PIT_Date) AS prior_data
                               FROM  dbo.V_Worker_PIT_with_IsManager AS t
                               WHERE (employee_id_alt = a.Employee_ID_ALT) AND (PIT_Date < a.PIT_Date) AND (IsManager <> a.IsManager) OR
                               (employee_id_alt = a.Employee_ID_ALT) AND (PIT_Date < a.PIT_Date) AND (Job_Level <> a.Job_Level)) AS prev_job_change_date, IsManager
FROM            VCTE_Promotions AS a)

SELECT  VCTE_Promotion_v2_Eval.Employee_ID_ALT,
                         COALESCE (v_cur.Employee_Type, N'') AS Curr_Employee_Type,                          
                         v_cur.Review_Rating_Current
FROM VCTE_Promotion_v2_Eval INNER JOIN
[DM_GlobalStaff].[dbo].[V_Worker_CUR] as v_cur ON VCTE_Promotion_v2_Eval.Employee_ID_ALT = v_cur.Employee_ID_ALT LEFT OUTER JOIN
dbo.V_Worker_PIT_with_IsManager as v_m ON VCTE_Promotion_v2_Eval.prev_job_change_date = v_m.PIT_Date AND 
VCTE_Promotion_v2_Eval.Employee_ID_ALT = v_m.employee_id_alt ) as promotions
"""

promotions = spark.read.jdbc(url=jdbcUrl, table=promotions, properties=connectionProperties)
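
For reference, a minimal sketch of the shape that the JDBC reader does accept: spark.read.jdbc substitutes the table argument into the FROM clause of the statement it sends to SQL Server, so it has to be valid as a derived table, i.e. a single parenthesised SELECT with an alias, and a WITH clause is not valid inside that wrapper. The simplified query below is only a placeholder to illustrate the pattern, not a rewrite of the full query above; jdbcUrl and connectionProperties are assumed to be defined as in my code.

# Minimal sketch of a subquery form that spark.read.jdbc accepts:
# a single parenthesised SELECT with an alias, no WITH clause.
simple_query = """
(SELECT v.Employee_ID_ALT,
        v.Job_Level,
        v.Management_Level
 FROM Worker_CUR AS v) AS promotions
"""

df = spark.read.jdbc(url=jdbcUrl, table=simple_query, properties=connectionProperties)
df.printSchema()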

Your timely assistance would be greatly appreciated.

0 Answers:

No answers yet.