How to avoid a cross join in Hive?

Time: 2018-11-07 07:07:24

Tags: hive cross-join

I have two tables. One contains 1 million records and the other contains 20 million records.


    table 1
    value
    (1, 1)
    (2, 2)
    (3, 3)
    (4, 4)
    (5, 4)
    ....

    table 2
    value
    (55, 11)
    (33, 22)
    (44, 66)
    (22, 11)
    (11, 33)
    ....

I need to multiply the values from table 1 by the values from table 2, rank the results, and take the top 5 for each value in table 1. The result would be:


    value from table 1, top 5 for each value in table 1
    (1, 1), 1*44 + 1*66 = 110
    (1, 1), 1*55 + 1*11 = 66
    (1, 1), 1*33 + 1*22 = 55
    (1, 1), 1*11 + 1*33 = 44
    (1, 1), 1*22 + 1* 11 = 33
    .....

I tried using a cross join in Hive, but because the tables are so large it always fails.
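In SQL terms, what I am trying to run looks roughly like the sketch below. The column names `a` and `b` are placeholders for my real schema, and the score is the component-wise product summed, as in the example above:

    -- Sketch of the naive approach that fails: cross join everything,
    -- score each pair, keep the top 5 scores per table 1 row.
    -- table1(a INT, b INT) and table2(a INT, b INT) are illustrative names.
    select a1, b1, a2, b2, score
      from (select t1.a as a1, t1.b as b1,
                   t2.a as a2, t2.b as b2,
                   t1.a * t2.a + t1.b * t2.b as score,
                   row_number() over(partition by t1.a, t1.b
                                     order by t1.a * t2.a + t1.b * t2.b desc) as rn
              from table1 t1
                   cross join table2 t2
           ) ranked
     where rn <= 5;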

2 Answers:

Answer 0 (score: 2)

First select the top 5 rows from table 2, then cross join that small result with the first table. This produces the same result as cross joining the two tables and taking the top 5 after the join, but far fewer rows are joined in the first case. A cross join with a 5-row dataset is converted to a map join and executes about as fast as a full scan of table1.

See the demo below. The cross join has been converted to a map join. Note the "Map Join Operator" in the plan and the warning: "Warning: Map Join MAPJOIN[19][bigTable=?] in task 'Map 1' is a cross product"

hive> set hive.cbo.enable=true;
hive> set hive.compute.query.using.stats=true;
hive> set hive.execution.engine=tez;
hive> set hive.auto.convert.join.noconditionaltask=false;
hive> set hive.auto.convert.join=true;
hive> set hive.vectorized.execution.enabled=true;
hive> set hive.vectorized.execution.reduce.enabled=true;
hive> set hive.vectorized.execution.mapjoin.native.enabled=true;
hive> set hive.vectorized.execution.mapjoin.native.fast.hashtable.enabled=true;
hive>
    > explain
    > with table1 as (
    > select stack(5,1,2,3,4,5) as id
    > ),
    > table2 as
    > (select t2.id
    >    from (select t2.id, dense_rank() over(order by id desc) rnk
    >            from (select stack(11,55,33,44,22,11,1,2,3,4,5,6) as id) t2
    >         )t2
    >   where t2.rnk<6
    > )
    > select t1.id, t1.id*t2.id
    >   from table1 t1
    >        cross join table2 t2;
Warning: Map Join MAPJOIN[19][bigTable=?] in task 'Map 1' is a cross product
OK
Plan not optimized by CBO.

Vertex dependency in root stage
Map 1 <- Reducer 3 (BROADCAST_EDGE)
Reducer 3 <- Map 2 (SIMPLE_EDGE)

Stage-0
   Fetch Operator
      limit:-1
      Stage-1
         Map 1
         File Output Operator [FS_17]
            compressed:false
            Statistics:Num rows: 1 Data size: 26 Basic stats: COMPLETE Column stats: NONE
            table:{"serde:":"org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe","input format:":"org.apache.hadoop.mapred.TextInputFormat","output format:":"org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat"}
            Select Operator [SEL_16]
               outputColumnNames:["_col0","_col1"]
               Statistics:Num rows: 1 Data size: 26 Basic stats: COMPLETE Column stats: NONE
               Map Join Operator [MAPJOIN_19]
               |  condition map:[{"":"Inner Join 0 to 1"}]
               |  HybridGraceHashJoin:true
               |  keys:{}
               |  outputColumnNames:["_col0","_col1"]
               |  Statistics:Num rows: 1 Data size: 26 Basic stats: COMPLETE Column stats: NONE
               |<-Reducer 3 [BROADCAST_EDGE]
               |  Reduce Output Operator [RS_14]
               |     sort order:
               |     Statistics:Num rows: 1 Data size: 0 Basic stats: PARTIAL Column stats: COMPLETE
               |     value expressions:_col0 (type: int)
               |     Select Operator [SEL_9]
               |        outputColumnNames:["_col0"]
               |        Statistics:Num rows: 1 Data size: 0 Basic stats: PARTIAL Column stats: COMPLETE
               |        Filter Operator [FIL_18]
               |           predicate:(dense_rank_window_0 < 6) (type: boolean)
               |           Statistics:Num rows: 1 Data size: 0 Basic stats: PARTIAL Column stats: COMPLETE
               |           PTF Operator [PTF_8]
               |              Function definitions:[{"Input definition":{"type:":"WINDOWING"}},{"partition by:":"0","name:":"windowingtablefunction","order by:":"_col0(DESC)"}]
               |              Statistics:Num rows: 1 Data size: 0 Basic stats: PARTIAL Column stats: COMPLETE
               |              Select Operator [SEL_7]
               |              |  outputColumnNames:["_col0"]
               |              |  Statistics:Num rows: 1 Data size: 0 Basic stats: PARTIAL Column stats: COMPLETE
               |              |<-Map 2 [SIMPLE_EDGE]
               |                 Reduce Output Operator [RS_6]
               |                    key expressions:0 (type: int), col0 (type: int)
               |                    Map-reduce partition columns:0 (type: int)
               |                    sort order:+-
               |                    Statistics:Num rows: 1 Data size: 48 Basic stats: COMPLETE Column stats: COMPLETE
               |                    UDTF Operator [UDTF_5]
               |                       function name:stack
               |                       Statistics:Num rows: 1 Data size: 48 Basic stats: COMPLETE Column stats: COMPLETE
               |                       Select Operator [SEL_4]
               |                          outputColumnNames:["_col0","_col1","_col2","_col3","_col4","_col5","_col6","_col7","_col8","_col9","_col10","_col11"]
               |                          Statistics:Num rows: 1 Data size: 48 Basic stats: COMPLETE Column stats: COMPLETE
               |                          TableScan [TS_3]
               |                             alias:_dummy_table
               |                             Statistics:Num rows: 1 Data size: 1 Basic stats: COMPLETE Column stats: COMPLETE
               |<-UDTF Operator [UDTF_2]
                     function name:stack
                     Statistics:Num rows: 1 Data size: 24 Basic stats: COMPLETE Column stats: COMPLETE
                     Select Operator [SEL_1]
                        outputColumnNames:["_col0","_col1","_col2","_col3","_col4","_col5"]
                        Statistics:Num rows: 1 Data size: 24 Basic stats: COMPLETE Column stats: COMPLETE
                        TableScan [TS_0]
                           alias:_dummy_table
                           Statistics:Num rows: 1 Data size: 1 Basic stats: COMPLETE Column stats: COMPLETE

Time taken: 0.199 seconds, Fetched: 66 row(s)

Just replace the stack() demo data in the demo with your tables.
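For instance, adapting the demo to real tables might look roughly like this sketch. Here table1(a, b) and table2(a, b) are illustrative names, and ranking table 2 by a single column simply mirrors the demo; adjust the order by to whatever defines your top 5 of table 2:

    -- Sketch only: the stack() demo data replaced with real tables.
    -- The 5-row CTE is small enough to be broadcast as a map join.
    with top5 as
    (select t2.a, t2.b
       from (select a, b, dense_rank() over(order by a desc) rnk
               from table2
            ) t2
      where t2.rnk < 6
    )
    select t1.a, t1.b, t1.a * t5.a + t1.b * t5.b as score
      from table1 t1
           cross join top5 t5;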

Answer 1 (score: 0)
