I need to convert a handful of date formats into boolean conditions based on some business logic, but I am running into a problem when calling a Python script from Hive. Below is the script I wrote to convert the date format of one sample column:
import sys

def getYearMonthFromStringDate(dt):
    year=0
    month=0
    try:
        ss=dt.split('-')
        year=ss[0]
        month=ss[1]
    except ValueError:
        print "Error parsing date string %s" %dt
    return int(year)*100+int(month)

for line in sys.stdin:
    tempArr=line.split('\t')
    accountgl0s=tempArr[0]
    agl0 = getYearMonthFromStringDate(accountgl0s)
    output_list = [accountgl0s, ag10]
    print '\t'.join(output_list)
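As a side note, a TRANSFORM script like this can be exercised outside Hive by piping a sample row into it on stdin. A quick local check, assuming a Python 2 interpreter and the script saved as /folder/date.py, would be:

echo '2016-10-01' | python /folder/date.py

If the script raises an exception or exits non-zero here, the same failure will surface as a script-operator error inside the Hive job.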
I added the file to the distributed cache with the following command:
add file /folder/date.py
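If there is any doubt whether the file actually made it into the session, Hive's resource commands can also list what has been added (standard ADD/LIST/DELETE FILE support; worth confirming on your Hive version):

list files;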
Now I invoke this Python script on the column accountgl0s of my Hive table using TRANSFORM, as shown below:
Input column accountgl0s = '2016-10-01'
select transform(accountgl0) using 'python date.py' as (accountgl0s,agl0) from sample;
My expected output is 2016-10-01 201610 (i.e. 2016*100 + 10 from the helper above). But instead I get the following error:
Error: java.lang.RuntimeException: Hive Runtime Error while closing operators
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:217)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: [Error 20003]: An error occurred when trying to close the Operator running your custom script.
at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:557)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:199)
... 8 more
FAILED: Execution Error, return code 20003 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. An error occurred when trying to close the Operator running your custom script.
MapReduce Jobs Launched:
Stage-Stage-1: Map: 1 HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
Answer 0 (score: 0)
I actually need to include more business logic in the Python script; for testing purposes I am only exercising this small piece of code that converts the date string. Do you see any problem with this script?
Answer 1 (score: 0)
When you want to do calculations with numbers, you have to change the variable type to float:
f_accountgl0s = float(accountgl0s)
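Read in context, the suggestion appears to be about converting fields explicitly before using them. As a hedged sketch of the script's emit step under that reading (not the answerer's exact code), note that the computed year-month value also has to be turned back into a string before the join, because '\t'.join() only accepts strings; the str() cast below is an assumption added for illustration:

agl0 = getYearMonthFromStringDate(accountgl0s)   # yields an int such as 201610
output_list = [accountgl0s, str(agl0)]           # join() needs strings, so cast the int back
print '\t'.join(output_list)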