Pig error with the SUM function

Date: 2015-01-01 20:23:57

Tags: hadoop apache-pig hadoop2

I have data like this:

store   trn_date    dept_id sale_amt
1       2014-12-14  101     10007655
1       2014-12-14  101     10007654
1       2014-12-14  101     10007544
6       2014-12-14  104     100086544
8       2014-12-14  101     1000000
9       2014-12-14  106     1000000

I want to get the sum of sale_amt. For that I am doing the following.

First, I load the data using:

table = LOAD 'table' USING org.apache.hcatalog.pig.HCatLoader();

Then I group the data by store, tran_date, dept_id:

grp_table = GROUP table BY (store, tran_date, dept_id);

Finally, I try to get the SUM of sale_amt using:

grp_gen = FOREACH grp_table GENERATE 
           FLATTEN(group) AS (store, tran_date, dept_id),
           SUM(table.sale_amt) AS total_sale_amt;

I am getting the error below:

================================================================================
Pig Stack Trace
---------------
ERROR 2103: Problem doing work on Longs

org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: grouped_all: Local Rearrange[tuple]{tuple}(false) - scope-1317 Operator Key: scope-1317): org.apache.pig.backend.executionengine.ExecException: ERROR 2103: Problem doing work on Longs
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(PhysicalOperator.java:289)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POLocalRearrange.getNextTuple(POLocalRearrange.java:263)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigCombiner$Combine.processOnePackageOutput(PigCombiner.java:183)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigCombiner$Combine.reduce(PigCombiner.java:161)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigCombiner$Combine.reduce(PigCombiner.java:51)
        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:171)
        at org.apache.hadoop.mapred.Task$NewCombinerRunner.combine(Task.java:1645)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1611)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1462)
        at org.apache.hadoop.mapred.MapTask$NewOutputCollector.close(MapTask.java:700)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:770)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 2103: Problem doing work on Longs
        at org.apache.pig.builtin.AlgebraicLongMathBase.doTupleWork(AlgebraicLongMathBase.java:84)
        at org.apache.pig.builtin.AlgebraicLongMathBase$Intermediate.exec(AlgebraicLongMathBase.java:108)
        at org.apache.pig.builtin.AlgebraicLongMathBase$Intermediate.exec(AlgebraicLongMathBase.java:102)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.expressionOperators.POUserFunc.getNext(POUserFunc.java:330)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.expressionOperators.POUserFunc.getNextTuple(POUserFunc.java:369)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.getNext(PhysicalOperator.java:333)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.processPlan(POForEach.java:378)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.getNextTuple(POForEach.java:298)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(PhysicalOperator.java:281)
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Number
        at org.apache.pig.builtin.AlgebraicLongMathBase.doTupleWork(AlgebraicLongMathBase.java:77)
================================================================================

Since I am reading the table with the HCatalog loader and the data type in the Hive table is string, I also tried using a cast in the script, but I still get the same error.

1 Answer:

Answer 0 (score: 0):

I don't have HCatalog installed on my system, so I tried it with a plain file, but the approach and code below will work for you.

1. SUM works only on numeric data types (int, long, float, double, bigdecimal, biginteger, or bytearray cast as double). Your sale_amt column looks like a string, so you need to cast it to long or double before applying SUM.

2. You should not use store as a field name because it is a reserved keyword in Pig, so you have to rename it to something else; otherwise you will get an error. I renamed it to 'stores'.

Example:

Input file 'table':

1       2014-12-14      101     10007655
1       2014-12-14      101     10007654
1       2014-12-14      101     10007544
6       2014-12-14      104     100086544
8       2014-12-14      101     1000000
9       2014-12-14      106     1000000

PigScript:

A = LOAD 'table' USING PigStorage() AS (store:chararray,trn_date:chararray,dept_id:chararray,sale_amt:chararray);
B = FOREACH A GENERATE $0 AS stores,trn_date,dept_id,(long)sale_amt; -- Renamed the field store to stores and cast sale_amt to long.
C = GROUP B BY (stores,trn_date,dept_id);
D = FOREACH C GENERATE FLATTEN(group),SUM(B.sale_amt);
DUMP D;

Output:

(1,2014-12-14,101,30022853)
(6,2014-12-14,104,100086544)
(8,2014-12-14,101,1000000)
(9,2014-12-14,106,1000000)
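
If you keep loading through HCatLoader rather than PigStorage, the same two fixes should carry over. A minimal sketch, assuming the Hive table exposes sale_amt as a string and uses the column names shown in the question (not tested against HCatalog):

raw = LOAD 'table' USING org.apache.hcatalog.pig.HCatLoader();
-- Rename the reserved word 'store' (referenced positionally as $0)
-- and cast sale_amt to long so SUM receives a numeric type.
B = FOREACH raw GENERATE $0 AS stores, trn_date, dept_id, (long)sale_amt AS sale_amt;
C = GROUP B BY (stores, trn_date, dept_id);
D = FOREACH C GENERATE FLATTEN(group) AS (stores, trn_date, dept_id),
                       SUM(B.sale_amt) AS total_sale_amt;
DUMP D;

Note that the cast needs to be applied to the individual field in a FOREACH before the GROUP; SUM receives a bag, and casting that bag as a whole does not convert the string values inside it.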