Spark SQL functions cast(right()), cast(left()) support in PySpark

Time: 2016-10-21 14:14:35

Tags: mysql apache-spark pyspark

Hello, I have a SQL query that I want to execute with Spark, but I'm not sure what exactly the problem is here. My query is shown below.

Here are some sample log file values:

 '2_CALL_2700_MMP345768_PLL_20160401_104150' => logfile6

 '2_CALL_2700_MMP345768_RTT_20160401_103931' => logfile5



aa = sqlcontext.sql("SELECT cast(right(Logfile6,6) as time) as 'log6_time',cast(right(Logfile5,6) as time) as 'log5_time',right(left(timediff(cast(right(Logfile6,6) as time),cast(right(Logfile5,6) as time)),5),2)*60 + right(timediff(cast(right(Logfile6,6) as time),cast(right(Logfile5,6) as time)),2) as 'MT(sec)' from qq1 where timestamp > '2016-04-01 00:00:00'") 
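For reference, the query is meant to take the last six characters of each log file name as an HHMMSS timestamp and return the difference in seconds. A minimal Python sketch of that arithmetic on the sample rows above (the variable names are just for illustration):

    from datetime import datetime

    # Last 6 characters of each sample file name, read as HHMMSS
    log6_time = datetime.strptime("104150", "%H%M%S")   # ..._PLL_20160401_104150
    log5_time = datetime.strptime("103931", "%H%M%S")   # ..._RTT_20160401_103931

    # 10:41:50 - 10:39:31 = 00:02:19 -> 2*60 + 19 = 139 seconds
    duration_sec = int((log6_time - log5_time).total_seconds())
    print(duration_sec)  # 139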

When I execute it with Spark, it throws this error:

py4j.protocol.Py4JJavaError: An error occurred while calling o38.sql.

: java.lang.RuntimeException: [1.18] failure: ``as'' expected but `(' found

SELECT cast(right(Logfile6,6) as time) as 'log6_time',cast(right(Logfile5,6) as time) as 'log5_time',right(left(timediff(cast(right(Logfile6,6) as time),cast(right(Logfile5,6) as time)),5),2)*60 + right(timediff(cast(right(Logfile6,6) as time),cast(right(Logfile5,6) as time)),2) as 'MT(sec)' from qq1 where timestamp > '2016-04-01 00:00:00' 
             ^

The same query returns a result in MySQL:

'10:41:50', '10:39:31', '139'

Or is there a better way to achieve this, i.e. to compute the duration?
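One possible alternative is to avoid the MySQL-specific pieces (right, left, timediff, cast as time) and compute the duration with PySpark DataFrame functions instead. A minimal sketch, assuming the table qq1 is already registered with string columns Logfile6, Logfile5, and timestamp as in the query above, and a Spark version where substring and unix_timestamp are available (1.5+); the output column name MT_sec stands in for 'MT(sec)':

    from pyspark.sql import functions as F

    df = sqlcontext.table("qq1")  # qq1 registered as in the query above

    # The last six characters of each file name are an HHMMSS timestamp;
    # unix_timestamp converts them to seconds, so the difference is the
    # duration in seconds (139 for the sample rows above).
    result = (df
        .where(F.col("timestamp") > "2016-04-01 00:00:00")
        .select(
            F.substring("Logfile6", -6, 6).alias("log6_time"),
            F.substring("Logfile5", -6, 6).alias("log5_time"),
            (F.unix_timestamp(F.substring("Logfile6", -6, 6), "HHmmss")
             - F.unix_timestamp(F.substring("Logfile5", -6, 6), "HHmmss")
             ).alias("MT_sec"),
        ))

    result.show()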

0 Answers:

There are no answers yet.