Calendar in SparkSQL

Asked: 2019-04-23 11:57:46

Tags: sql date calendar apache-spark-sql pyspark-sql

I want to create a calendar table with this query (it works in generic SQL):

SELECT DATEADD(day,t4 * 10000 + t3 * 1000 + t2 * 100 + t1 * 10 + t0,'1970-01-01') AS date_value
FROM
    (SELECT 0 t0 UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9) t0,
    (SELECT 0 t1 UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9) t1,
    (SELECT 0 t2 UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9) t2,
    (SELECT 0 t3 UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9) t3,
    (SELECT 0 t4 UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9) t4

But when I migrate it to SparkSQL with a few modifications (the date_add function), it always fails with a syntax error: missing ')' at 'SELECT'. Any help? Thanks.
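
(For reference, Spark SQL has no DATEADD(day, n, date); its built-in date_add takes the start date first and the day offset second, date_add(start_date, num_days). A minimal sketch of the call, shown only to illustrate the argument order:)

    -- Spark SQL: date_add(start_date, num_days)
    SELECT date_add('1970-01-01', 5) AS date_value;  -- returns 1970-01-06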

1 answer:

Answer 0 (score: 0)

This works:

SELECT 10*t1.t1+t0.t0 id, DATE_ADD('1970-01-01', 10*t1.t1+t0.t0) AS date_value
FROM
    (SELECT 0 t0 UNION SELECT 1 t0 UNION SELECT 2 t0 UNION SELECT 3 t0 UNION SELECT 4 t0 UNION SELECT 5 t0 UNION SELECT 6 t0 UNION SELECT 7 t0 UNION SELECT 8 t0 UNION SELECT 9 t0) t0,
    (SELECT 0 t1 UNION SELECT 1 t1 UNION SELECT 2 t1 UNION SELECT 3 t1 UNION SELECT 4 t1 UNION SELECT 5 t1 UNION SELECT 6 t1 UNION SELECT 7 t1 UNION SELECT 8 t1 UNION SELECT 9 t1) t1

Result:

    +-----+-------------+
    | id  | date_value  |
    +-----+-------------+
    | 11  | 1970-01-12  |
    | 61  | 1970-03-03  |
    | 31  | 1970-02-01  |
    | 51  | 1970-02-21  |
    | 41  | 1970-02-11  |
    ...
    | 80  | 1970-03-22  |
    | 90  | 1970-04-01  |
    | 70  | 1970-03-12  |
    | 0   | 1970-01-01  |
    +-----+-------------+
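
(If you are on Spark 2.4 or later, the same calendar can be generated without the cross-joined digit tables by using the built-in sequence and explode functions. A minimal sketch, where the 100-day range is just an illustration and should be adjusted to your needs:)

    -- Spark SQL 2.4+: one row per day via sequence() + explode()
    SELECT explode(
             sequence(to_date('1970-01-01'),          -- start date
                      date_add('1970-01-01', 99),     -- end date (inclusive); adjust as needed
                      interval 1 day)
           ) AS date_value;

This avoids the UNIONed digit tables entirely and keeps the date range in one place.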