Poor hash join performance with Pig on Tez

Asked: 2018-05-02 16:17:09

Tags: hadoop apache-pig tez

I have a series of Pig scripts that transform hundreds of millions of records from multiple data sources, which need to be joined together. Towards the end of each script, I reach a point where JOIN performance becomes abysmally slow. Looking at the DAG in Tez View, I see that the join is split into relatively few tasks (typically 100-200), but each task takes several hours to complete. The task description shows it is performing a HASH_JOIN.

Interestingly, I only hit this bottleneck when running on the Tez execution engine. On MapReduce it still takes a while, but nothing like the painful crawl I get on Tez. However, running on MapReduce is a problem, because I run into MapReduce issues of my own, about which I have asked a separate question here.

Here is a sample of my code (apologies, I had to make it very generic in order to post it on the internet). I would like to know what I can do to eliminate this bottleneck. Would specifying parallelism help? Is there something wrong with my approach?
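One knob worth noting up front (a sketch only, not verified against this workload): Pig lets you raise reduce-side parallelism globally with `default_parallel`, which also applies to the reduce phase of a hash join. The value 400 below is an arbitrary placeholder, not a recommendation for this cluster.

```pig
-- Assumption: 400 is a placeholder; tune it to the cluster's reducer capacity.
SET default_parallel 400;
```

A per-statement `PARALLEL` clause on the slow JOIN achieves the same thing for just that operator.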

-- Incoming data:
-- A: hundreds of millions of rows, 19 fields
-- B: hundreds of millions of rows, 3 fields
-- C: hundreds of millions of rows, 5 fields
-- D: a few thousand rows, 5 fields

J = -- This reduces the size of A, but still probably in the hundreds of millions
    FILTER A
    BY qualifying == 1;

K = -- This is a one-to-one join that doesn't explode the number of rows in J
    JOIN J BY Id
       , B BY Id;

L =
    FOREACH K
    GENERATE J1 AS L1
           , J2 AS L2
           , J3 AS L3
           , J4 AS L4
           , J5 AS L5
           , J6 AS L6
           , J7 AS L7
           , J8 AS L8
           , B1 AS L9
           , B2 AS L10
           ;

M = -- Reduces the size of C to around one hundred million rows
    FILTER C
    BY Code matches 'Code-.+';

M_WithYear =
    FOREACH M
    GENERATE *
           , (int)REGEX_EXTRACT(Code, 'Code-.+-([0-9]+)', 1) AS year:int
           ;

SPLIT M_WithYear
    INTO M_annual IF year <= (int)'$currentYear' -- roughly 75% of the data from M
       , M_lifetime IF Code == 'Code-Lifetime'; -- roughly 25% of the data from M

-- Transformations for M_annual

N = -- Joins M_annual (per the comments above and below, this branch handles M_annual)
    JOIN M_annual BY Id, D BY Id USING 'replicated';

O = -- This is where performance falls apart
    JOIN N BY (Id, year, M7) -- M7 matches L7
       , L BY (Id, year, L7);

P =
    FOREACH O
    GENERATE N1 AS P1
           , N2 AS P2
           , N3 AS P3
           , N4 AS P4
           , N5 AS P5
           , N6 AS P6
           , N7 AS P7
           , N8 AS P8
           , N9 AS P9
           , L1 AS P10
           , L2 AS P11
           ;

-- Transformations N-P above repeated for M_lifetime
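If a handful of (Id, year, M7) keys are far more common than the rest, the hash join in O funnels most rows through a few reducers, which would explain few, long-running tasks. A hedged sketch of the same join rewritten as a Pig skew join, which samples the key distribution and spreads hot keys across reducers (the PARALLEL value is again a placeholder):

```pig
O = -- Skew-join variant of the join where performance falls apart
    JOIN N BY (Id, year, M7)
       , L BY (Id, year, L7)
    USING 'skewed' PARALLEL 400;
```

Note that `USING 'skewed'` supports a two-way join only, which fits this statement, and that the earlier join against the small relation D already uses `'replicated'`, which is the right choice there.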

0 Answers:

There are no answers yet