Apache Pig: filtering one relation against another?

Asked: 2012-11-16 21:51:41

Tags: hadoop apache-pig

I'm hoping to run a Pig script that splits the data into two relations (or whatever they're called in Pig) based on a condition on col2, manipulates each into another column, then compares the two manipulated relations and makes an additional exclusion.

REGISTER /home/user1/piggybank.jar;

log = LOAD '../user2/hadoop_file.txt' AS (col1, col2);

--log = LIMIT log 1000000;
isnt_filtered = FILTER log BY (NOT col2 == 'Some value');
isnt_generated = FOREACH isnt_filtered GENERATE col2, col1, RANDOM() * 1000000 AS random, com.some.valueManipulation(col1) AS isnt_manipulated;

is_filtered = FILTER log BY (col2 == 'Some value');
is_generated = FOREACH is_filtered GENERATE com.some.calculation(col1) AS is_manipulated;
is_distinct = DISTINCT is_generated;

The splitting and manipulating is the easy part. Here's where it gets complicated...

merge_filtered = FOREACH is_generated {FILTER isnt_generated BY (NOT isnt_manipulated == is_generated.is_manipulated)};

If I can figure out this one line, the rest falls into place.

merge_ordered = ORDER merge_filtered BY random, col2, col1;
merge_limited = LIMIT merge_ordered 400000;

STORE merge_limited into 'file';
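For reference, the split-and-manipulate steps above can be sketched in plain Python (a simulation for illustration only, not Pig; `value_manipulation` is a hypothetical stand-in for `com.some.valueManipulation`):

```python
import random

def value_manipulation(col1):
    # hypothetical stand-in for com.some.valueManipulation
    return col1.lower()

log = [("This", "qWerty"), ("Is", "Some value")]

# FILTER log BY (NOT col2 == 'Some value')
isnt_filtered = [(c1, c2) for c1, c2 in log if c2 != "Some value"]
# FOREACH ... GENERATE col2, col1, RANDOM() * 1000000, manipulation(col1)
isnt_generated = [
    (c2, c1, random.random() * 1000000, value_manipulation(c1))
    for c1, c2 in isnt_filtered
]

# FILTER log BY (col2 == 'Some value')
is_filtered = [(c1, c2) for c1, c2 in log if c2 == "Some value"]
```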

Here is an example of the I/O:

col1                col2            manipulated
This                qWerty          W
Is                  qweRty          R
An                  qwertY          Y
Example             qwErty          E
Of                  qwerTy          T
Example             Qwerty          Q
Data                qWerty          W


isnt
E
Y


col1                col2
This                qWerty
Is                  qweRty
Of                  qwerTy
Example             Qwerty
Data                qWerty

1 Answer:

Answer 0 (score: 2)

I'm still not sure exactly what you need, but I believe you can reproduce your input and output with the following (untested):

data = LOAD 'input' AS (col1:chararray, col2:chararray);
exclude = LOAD 'exclude' AS (excl:chararray);

m = FOREACH data GENERATE col1, col2, YourUDF(col2) AS manipulated;
test = COGROUP m BY manipulated, exclude BY excl;

-- Here you can choose IsEmpty or NOT IsEmpty according to whether you want to exclude or include
final = FOREACH (FILTER test BY IsEmpty(exclude)) GENERATE FLATTEN(m);

With COGROUP, you group all the tuples from each relation by the grouping key. If the bag of tuples from exclude is empty, that means the grouping key was not present in the exclusion list, so you keep the tuples of m with that key. Conversely, if the grouping key is present in exclude, that bag will not be empty, and the tuples of m with that key get filtered out.
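The same exclusion semantics can be simulated in Python (illustration only, not Pig; `manipulate` is a made-up stand-in for the asker's UDF that picks out the uppercase letter, matching the sample I/O):

```python
def manipulate(col2):
    # hypothetical stand-in for YourUDF: return the capitalized character
    return next(c for c in col2 if c.isupper())

def cogroup_exclude(data, exclude):
    # Mirrors COGROUP m BY manipulated, exclude BY excl, then
    # FILTER ... BY IsEmpty(exclude): keep a row only when its
    # manipulated key never appears in the exclusion list.
    excl = set(exclude)
    return [(c1, c2) for c1, c2 in data if manipulate(c2) not in excl]

data = [
    ("This", "qWerty"), ("Is", "qweRty"), ("An", "qwertY"),
    ("Example", "qwErty"), ("Of", "qwerTy"), ("Example", "Qwerty"),
    ("Data", "qWerty"),
]
exclude = ["E", "Y"]

result = cogroup_exclude(data, exclude)
```

Run against the sample input, `result` drops the rows whose manipulated value is E or Y, reproducing the expected output table in the question.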