I have a query in which a column converts a generic datetime field (not a timestamp) to local time based on a timezone code. In Oracle I was able to do this with the snippet below, but Spark does not accept NUMTODSINTERVAL. How can I do this in Spark SQL?
case when c.timezone in (4,5) then to_char(b.universal_datetime + NUMTODSINTERVAL(3, 'HOUR'),'yyyy/mm/dd HH24:MI:SS')
when c.timezone in (8) then to_char(b.universal_datetime,'yyyy/mm/dd HH24:MI:SS')
when c.timezone in (7) then to_char(b.universal_datetime + NUMTODSINTERVAL(1, 'HOUR'),'yyyy/mm/dd HH24:MI:SS')
when c.timezone in (6) then to_char(b.universal_datetime + NUMTODSINTERVAL(2, 'HOUR'),'yyyy/mm/dd HH24:MI:SS')
when c.timezone in (10) then to_char(b.universal_datetime - NUMTODSINTERVAL(3, 'HOUR'),'yyyy/mm/dd HH24:MI:SS')
when c.timezone in (9) then to_char(b.universal_datetime - NUMTODSINTERVAL(1, 'HOUR'),'yyyy/mm/dd HH24:MI:SS')
ELSE 'Other' END AS Local_Time,
Answer 0: (score: 0)
SELECT
current_timestamp() AS current_timestamp,
(current_timestamp() - INTERVAL '6' HOUR) AS current_timestamp_minus_six_hours
Oracle's NUMTODSINTERVAL and TO_CHAR are not part of Apache Spark SQL's built-in time functions (see https://spark.apache.org/docs/2.3.0/api/sql/index.html), but interval literal arithmetic as shown above is supported.
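Putting this together, the Oracle CASE expression from the question could be rewritten in Spark SQL roughly as below. This is a sketch, not a tested query: it assumes the same table aliases (`b`, `c`) and column names as the question, and uses `date_format` as the Spark counterpart of Oracle's `to_char`. Note that Spark's pattern letters differ from Oracle's: `MM` is month and `mm` is minutes, so `'yyyy/mm/dd HH24:MI:SS'` becomes `'yyyy/MM/dd HH:mm:ss'`.

```sql
-- Sketch: Spark SQL version of the Oracle CASE expression.
-- Interval literals (INTERVAL n HOURS) replace NUMTODSINTERVAL,
-- and date_format replaces to_char.
SELECT
  CASE
    WHEN c.timezone IN (4, 5) THEN date_format(b.universal_datetime + INTERVAL 3 HOURS, 'yyyy/MM/dd HH:mm:ss')
    WHEN c.timezone IN (8)    THEN date_format(b.universal_datetime,                    'yyyy/MM/dd HH:mm:ss')
    WHEN c.timezone IN (7)    THEN date_format(b.universal_datetime + INTERVAL 1 HOUR,  'yyyy/MM/dd HH:mm:ss')
    WHEN c.timezone IN (6)    THEN date_format(b.universal_datetime + INTERVAL 2 HOURS, 'yyyy/MM/dd HH:mm:ss')
    WHEN c.timezone IN (10)   THEN date_format(b.universal_datetime - INTERVAL 3 HOURS, 'yyyy/MM/dd HH:mm:ss')
    WHEN c.timezone IN (9)    THEN date_format(b.universal_datetime - INTERVAL 1 HOUR,  'yyyy/MM/dd HH:mm:ss')
    ELSE 'Other'
  END AS Local_Time
```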