Hive sum(column1 * column2) problem

Date: 2016-03-10 03:11:53

Tags: apache hive hiveql

Hive version: 1.0

select SUM(table.quantity * table.our_price) from table;

This simple query fails with the following error:

Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) [Error getting row data with exception java.lang.ArrayIndexOutOfBoundsException: 1
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.readVInt(LazyBinaryUtils.java:310)
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.checkObjectByteInfo(LazyBinaryUtils.java:215)
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.parse(LazyBinaryStruct.java:142)
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.getField(LazyBinaryStruct.java:199)
    at org.apache.hadoop.hive.serde2.lazybinary.objectinspector.LazyBinaryStructObjectInspector.getStructFieldData(LazyBinaryStructObjectInspector.java:64)
    at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:353)
    at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:353)
    at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:197)
    at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:183)
    at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:248)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:455)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:397)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:172)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:166)]
    at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:265)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:455)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:397)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:172)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:166)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) [Error getting row data with exception java.lang.ArrayIndexOutOfBoundsException: 1
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.readVInt(LazyBinaryUtils.java:310)
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.checkObjectByteInfo(LazyBinaryUtils.java:215)
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.parse(LazyBinaryStruct.java:142)
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.getField(LazyBinaryStruct.java:199)
    at org.apache.hadoop.hive.serde2.lazybinary.objectinspector.LazyBinaryStructObjectInspector.getStructFieldData(LazyBinaryStructObjectInspector.java:64)
    at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:353)
    at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:353)
    at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:197)
    at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:183)
    at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:248)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:455)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:397)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:172)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:166)]
    at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:253)
    ... 7 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ArrayIndexOutOfBoundsException: 1
    at org.apache.hadoop.hive.ql.exec.GroupByOperator.processOp(GroupByOperator.java:791)
    at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:244)
    ... 7 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.readVInt(LazyBinaryUtils.java:310)
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.checkObjectByteInfo(LazyBinaryUtils.java:215)
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.parse(LazyBinaryStruct.java:142)
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.getField(LazyBinaryStruct.java:199)
    at org.apache.hadoop.hive.serde2.lazybinary.objectinspector.LazyBinaryStructObjectInspector.getStructFieldData(LazyBinaryStructObjectInspector.java:64)
    at org.apache.hadoop.hive.ql.exec.ExprNodeColumnEvaluator._evaluate(ExprNodeColumnEvaluator.java:98)
    at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:77)
    at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:65)
    at org.apache.hadoop.hive.ql.exec.GroupByOperator.updateAggregations(GroupByOperator.java:597)
    at org.apache.hadoop.hive.ql.exec.GroupByOperator.processAggr(GroupByOperator.java:888)
    at org.apache.hadoop.hive.ql.exec.GroupByOperator.processKey(GroupByOperator.java:718)
    at org.apache.hadoop.hive.ql.exec.GroupByOperator.processOp(GroupByOperator.java:786)
    ... 8 more

I can't get much out of this error.

1 Answer:

Answer 0 (score: 0)

Given the "ArrayIndexOutOfBoundsException", my guess is that you either have NULL or empty values in table.quantity or table.our_price, or the sum result is too large. If the SUM overflows, you should cast the values to BIGINT before multiplying:

SELECT SUM(CAST(table.quantity AS BIGINT) * table.our_price) FROM table;
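
If NULLs are the culprit instead, you can test that theory by counting them directly and then neutralizing them with COALESCE. This is only a sketch against the table and column names from the question, not a confirmed fix:

-- Count rows where either operand is NULL; a non-zero result supports the NULL theory
SELECT COUNT(*) FROM table WHERE quantity IS NULL OR our_price IS NULL;

-- Treat NULL operands as 0 so the product is always defined
SELECT SUM(COALESCE(quantity, 0) * COALESCE(our_price, 0)) FROM table;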