Casting BIGINT to TIMESTAMP produces a garbage value. See the query below. Any help is appreciated.
scala> spark.sql("select cast(cast(cast(CAST('2015-11-15 18:15:06.51' AS TIMESTAMP) as double)*1000 + cast('64082' as double) as bigint) as timestamp) " ).show(truncate=false)
+-----------------------------------------------------------------------------------------------------------------------------------------------+
|CAST(CAST(((CAST(CAST(2015-11-15 18:15:06.51 AS TIMESTAMP) AS DOUBLE) * CAST(1000 AS DOUBLE)) + CAST(64082 AS DOUBLE)) AS BIGINT) AS TIMESTAMP)|
+-----------------------------------------------------------------------------------------------------------------------------------------------+
|47843-07-20 09:36:32.0 |
+-----------------------------------------------------------------------------------------------------------------------------------------------+
Answer 0 (score: 1)
Using Spark 1.6.
Your example seems to assume that casting a BIGINT to TIMESTAMP converts from milliseconds since 1970-01-01, but that is not the case: the cast interprets the value as seconds since the epoch. Since your BIGINT holds milliseconds, you end up with a garbage value tens of thousands of years in the future.
Note that, per this ticket, the behavior is actually configurable: https://issues.apache.org/jira/browse/HIVE-3454
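The same mismatch can be reproduced outside Spark with `java.time` (a minimal sketch; the epoch value below assumes UTC, whereas Spark would apply the session time zone): interpreting a millisecond count as seconds lands roughly 45,000 years in the future, just like the result above.

```scala
import java.time.Instant

object CastDemo extends App {
  // 2015-11-15 18:15:06.51 expressed as epoch *milliseconds* (assumed UTC)
  val millis = 1447611306510L

  // Treating the millisecond count as *seconds* -- what the BIGINT cast does --
  // lands tens of thousands of years in the future
  val garbage = Instant.ofEpochSecond(millis)

  // Treating it as milliseconds recovers the original instant
  val correct = Instant.ofEpochMilli(millis)

  println(garbage)
  println(correct) // 2015-11-15T18:15:06.510Z
}
```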
Answer 1 (score: 0)
Casting a BIGINT to TIMESTAMP expects epoch time in seconds. Try it without multiplying by 1000:
select cast(cast(cast(CAST('2015-11-15 18:15:06.51' AS TIMESTAMP) as double) + cast('64082' as double) as bigint) as timestamp)
Although this loses the millisecond precision.
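If the millisecond precision matters, one option (a sketch in plain Scala with `java.time`, not a Spark API) is to keep the arithmetic in milliseconds and convert only once at the end, using the 64082 ms offset from the question:

```scala
import java.time.Instant

object MillisPrecisionDemo extends App {
  // Do the offset arithmetic in milliseconds and convert once at the end,
  // so the fractional .51 seconds is not truncated.
  val base = Instant.parse("2015-11-15T18:15:06.510Z") // assumed UTC for illustration
  val shifted = base.plusMillis(64082L) // the 64082 ms offset from the question

  println(shifted) // 2015-11-15T18:16:10.592Z
}
```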