SparkSQL timestamp query fails

Date: 2014-11-21 20:15:02

Tags: scala timestamp apache-spark apache-spark-sql

I am loading some log files into SQL tables through Spark, and my schema looks like this:

|-- timestamp: timestamp (nullable = true) 
|-- c_ip: string (nullable = true) 
|-- cs_username: string (nullable = true) 
|-- s_ip: string (nullable = true) 
|-- s_port: string (nullable = true) 
|-- cs_method: string (nullable = true) 
|-- cs_uri_stem: string (nullable = true) 
|-- cs_query: string (nullable = true) 
|-- sc_status: integer (nullable = false) 
|-- sc_bytes: integer (nullable = false) 
|-- cs_bytes: integer (nullable = false) 
|-- time_taken: integer (nullable = false) 
|-- User_Agent: string (nullable = true) 
|-- Referrer: string (nullable = true) 

You can notice that I created a timestamp field, which I read is supported by Spark (as far as I understand, Date would not work). I would love to use queries like "where timestamp > (2012-10-08 16:10:36.0)", but when I run them I keep getting errors. I tried the following two syntax forms; for the second one I parse a string, so I am sure I am actually passing the value in a timestamp format. I used two functions: parse and date2timestamp.

Any hint on how I should handle timestamp values?

Thanks!

1)     scala> sqlContext.sql("SELECT * FROM Logs as l where l.timestamp=(2012-10-08 16:10:36.0)").collect

java.lang.RuntimeException: [1.55] failure: ``)'' expected but 16 found 

SELECT * FROM Logs as l where l.timestamp=(2012-10-08 16:10:36.0) 
                                                  ^ 

2)      sqlContext.sql("SELECT * FROM Logs as l where l.timestamp=" + date2timestamp(formatTime3.parse("2012-10-08 16:10:36.0"))).collect

java.lang.RuntimeException: [1.54] failure: ``UNION'' expected but 16 found 

SELECT * FROM Logs as l where l.timestamp=2012-10-08 16:10:36.0 
                                                 ^ 
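Both parse errors above come from the SQL parser hitting a bare, unquoted literal: 2012-10-08 16:10:36.0 is read as loose tokens, not as one value. A minimal sketch of building a properly single-quoted timestamp literal (plain Java, no Spark required; toSqlLiteral is a hypothetical helper, since the asker's date2timestamp and formatTime3 are not shown):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class TimestampLiteral {
    // Hypothetical helper: render a Date as a single-quoted SQL timestamp literal.
    static String toSqlLiteral(Date d) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.S");
        return "'" + fmt.format(d) + "'";
    }

    public static void main(String[] args) throws ParseException {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.S");
        Date d = fmt.parse("2012-10-08 16:10:36.0");
        // The quotes make the parser see a single string literal instead of bare tokens
        String query = "SELECT * FROM Logs as l where l.timestamp <= " + toSqlLiteral(d);
        System.out.println(query);
        // → SELECT * FROM Logs as l where l.timestamp <= '2012-10-08 16:10:36.0'
    }
}
```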

4 Answers:

Answer 0 (score: 5)

I figured that the problem was, first of all, the precision of the timestamp, and also that the string I pass to represent the timestamp has to be cast as a String.

So this query works now:

sqlContext.sql("SELECT * FROM Logs as l where cast(l.timestampLog as String) <= '2012-10-08 16:10:36'")

Answer 1 (score: 4)

You forgot the quotation marks.

Try with this syntax:

L.timestamp = '2012-07-16 00:00:00'

或者,尝试

L.timestamp = CAST('2012-07-16 00:00:00' AS TIMESTAMP)

Answer 2 (score: 1)

Cast the string representation of the timestamp to a timestamp: cast('2012-10-10 12:00:00' as timestamp). Then you can do the comparison as timestamps, not as strings. Instead of:

sqlContext.sql("SELECT * FROM Logs as l where cast(l.timestampLog as String) <= '2012-10-08 16:10:36'")

use:

sqlContext.sql("SELECT * FROM Logs as l where l.timestampLog <= cast('2012-10-08 16:10:36' as timestamp)")
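Both forms order rows the same way: a zero-padded yyyy-MM-dd HH:mm:ss string sorts lexicographically in chronological order, which is why the string cast in the accepted answer also works. A plain Java sketch of the equivalence (no Spark needed; the two sample instants are illustrative):

```java
import java.sql.Timestamp;

public class TimestampOrdering {
    public static void main(String[] args) {
        Timestamp earlier = Timestamp.valueOf("2012-10-08 16:10:36");
        Timestamp later   = Timestamp.valueOf("2012-10-09 00:00:00");

        // Comparing as timestamps: what cast('...' as timestamp) enables in the query
        System.out.println(earlier.before(later));  // true

        // Comparing the zero-padded string forms gives the same ordering,
        // which is why cast(l.timestampLog as String) <= '...' also works
        System.out.println(earlier.toString().compareTo(later.toString()) < 0);  // true
    }
}
```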

Answer 3 (score: 0)

Sadly, this did not work for me. I am using Apache Spark 1.4.1. The following code is my solution:

import java.util.Date;

Date date = new Date();

String query = "SELECT * FROM Logs as l where l.timestampLog <= CAST('" + new java.sql.Timestamp(date.getTime()) + "' as TIMESTAMP)";

sqlContext.sql(query);
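The construction above relies on java.sql.Timestamp.toString() producing the yyyy-MM-dd HH:mm:ss.fffffffff form that Spark SQL's CAST accepts. A self-contained sketch of just the query building (the query and column names follow the answer above; the fixed instant is an illustrative assumption so the output is reproducible):

```java
import java.sql.Timestamp;

public class QueryBuilder {
    // Hypothetical helper: embed a Timestamp's string form inside a CAST
    static String build(Timestamp ts) {
        return "SELECT * FROM Logs as l where l.timestampLog <= CAST('" + ts + "' as TIMESTAMP)";
    }

    public static void main(String[] args) {
        // Fixed instant instead of new Date() so the literal is reproducible
        Timestamp ts = Timestamp.valueOf("2012-10-08 16:10:36");
        System.out.println(build(ts));
        // → SELECT * FROM Logs as l where l.timestampLog <= CAST('2012-10-08 16:10:36.0' as TIMESTAMP)
    }
}
```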

Casting timestampLog to a String did not throw any errors, but it returned no data.