Hive: How to compare two columns with complex data types in a WHERE clause?

Asked: 2018-09-05 09:43:29

Tags: hadoop hive hiveql hadoop2 beeline

I have a Hive table as my source table, and another Hive table as my target. The DDL of the source and target tables is identical, except that a few journaling (audit) columns are added in the target table. Below are the DDLs. Source:

CREATE EXTERNAL TABLE source.customer_detail(
   id string,
   name string,
   city string,
   properties_owned array<struct<property_addr:string, location:string>>
)
ROW FORMAT SERDE
  'org.apache.hive.hcatalog.data.JsonSerDe'
STORED AS TEXTFILE
LOCATION
  '/user/aiman/customer_detail';

Target:

CREATE EXTERNAL TABLE target.customer_detail(
   id string,
   name string,
   city string,
   properties_owned array<struct<property_addr:string, location:string>>,
   audit_insterted_ts timestamp,
   audit_dml_action char(1)
)
PARTITIONED BY (audit_active_flag char(1))
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\u0001'
STORED AS ORC
LOCATION
  '/user/aiman/target/customer_detail';

Source data:

+---------------------+--------------------------+-------------------------+--------------------------------------------------------------------------------------------------------------------------------------+
| customer_detail.id  |   customer_detail.name   |  customer_detail.city   |                                               customer_detail.properties_owned                                                       |
+---------------------+--------------------------+-------------------------+--------------------------------------------------------------------------------------------------------------------------------------+
| 1                   | Aiman Sarosh             |      kolkata            |  [{"property_addr":"H1 Block Saltlake","location":"kolkata"},{"property_addr":"New Property Added Saltlake","location":"kolkata"}]   |
| 2                   | Justin                   |      delhi              |  [{"property_addr":"some address in delhi","location":"delhi"}]                                                                      |
+---------------------+--------------------------+-------------------------+--------------------------------------------------------------------------------------------------------------------------------------+

Target data:

+---------------------+--------------------------+-------------------------+------------------------------------------------------------------+--------------------------------------+-----------------------------------+------------------------------------+
| customer_detail.id  |   customer_detail.name   |  customer_detail.city   |              customer_detail.properties_owned                    |  customer_detail.audit_insterted_ts  | customer_detail.audit_dml_action  | customer_detail.audit_active_flag  |
+---------------------+--------------------------+-------------------------+------------------------------------------------------------------+--------------------------------------+-----------------------------------+------------------------------------+
| 1                   | Aiman Sarosh             |      kolkata            |  [{"property_addr":"H1 Block Saltlake","location":"kolkata"}]    | 2018-09-04 06:55:12.361              | I                                 | A                                  |
| 2                   | Justin                   |      delhi              |  [{"property_addr":"some address in delhi","location":"delhi"}]  | 2018-09-05 08:36:39.023              | I                                 | A                                  |
+---------------------+--------------------------+-------------------------+------------------------------------------------------------------+--------------------------------------+-----------------------------------+------------------------------------+

When I run the query below, it should fetch me 1 modified record, i.e.:

+---------------------+--------------------------+-------------------------+------------------------------------------------------------------------------------------------------------------------------------------------+--------------------------------------+-----------------------------------+------------------------------------+
| customer_detail.id  |   customer_detail.name   |  customer_detail.city   |                                                                  customer_detail.properties_owned                                              |  customer_detail.audit_insterted_ts  | customer_detail.audit_dml_action  | customer_detail.audit_active_flag  |
+---------------------+--------------------------+-------------------------+------------------------------------------------------------------------------------------------------------------------------------------------+--------------------------------------+-----------------------------------+------------------------------------+
| 1                   | Aiman Sarosh             |      kolkata            |  [{"property_addr":"H1 Block Saltlake","location":"kolkata"},{"property_addr":"New Property Added Saltlake","location":"kolkata"}]             | 2018-09-05 07:15:10.321              | U                                 | A                                  |
+---------------------+--------------------------+-------------------------+------------------------------------------------------------------------------------------------------------------------------------------------+--------------------------------------+-----------------------------------+------------------------------------+

Basically, the element {"property_addr":"New Property Added Saltlake","location":"kolkata"} has been added to the properties_owned array column in the source table for the record with ID 1.

Query:

SELECT  --fetch modified/updated records in source
   source.id AS id,
   source.name AS name,
   source.city AS city,
   source.properties_owned AS properties_owned,
   current_timestamp() AS audit_insterted_ts,
   'U' AS audit_dml_action,
   'A' AS audit_active_flag
FROM source.customer_detail source
INNER JOIN target.customer_detail jrnl
ON source.id=jrnl.id
WHERE source.name!=jrnl.name
OR source.city!=jrnl.city
OR source.properties_owned!=jrnl.properties_owned

But it throws an error:

Error: Error while compiling statement: FAILED: SemanticException [Error 10016]: Line 14:3 Argument type mismatch 'properties_owned': The 1st argument of NOT EQUAL  is expected to a primitive type, but list is found (state=42000,code=10016)

How can I compare two columns with complex data types in a WHERE clause when using JOINs?
I could use .POS and .ITEM, but that won't help, because my column is an array of structs and the arrays can have different lengths.

3 Answers:

Answer 0 (score: 1)

One way to handle complex types is to convert them to strings, for example JSON strings. The brickhouse project provides useful third-party Hive UDFs, including a to_json function that converts any complex type to a JSON string. First, clone the project and build the jar:

git clone https://github.com/klout/brickhouse.git
cd brickhouse
mvn clean package

Then copy the brickhouse jar to HDFS and add it to Hive:

add jar hdfs://<your_path>/brickhouse-0.7.1-SNAPSHOT.jar;

Register the to_json UDF in Hive:

create temporary function to_json as 'brickhouse.udf.json.ToJsonUDF';

Now you can use it, for example:

hive> select to_json(ARRAY(MAP('a',1), MAP('b',2)));
OK
[{"a":1},{"b":2}]

So in your case you need to convert the columns to JSON strings and then compare them in the WHERE clause. Keep in mind that to_json converts complex values as-is. For example, in your case the two arrays

[{"property_addr":"H1 Block Saltlake","location":"kolkata"},{"property_addr":"New Property Added Saltlake","location":"kolkata"}]

[{"property_addr":"New Property Added Saltlake","location":"kolkata"},{"property_addr":"H1 Block Saltlake","location":"kolkata"}]

would be treated as different.
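Putting this together, a minimal sketch of what the question's query could look like with to_json (untested; it assumes the jar was added and the function registered exactly as above):

SELECT  --fetch modified/updated records in source
   source.id AS id,
   source.name AS name,
   source.city AS city,
   source.properties_owned AS properties_owned,
   current_timestamp() AS audit_insterted_ts,
   'U' AS audit_dml_action,
   'A' AS audit_active_flag
FROM source.customer_detail source
INNER JOIN target.customer_detail jrnl
ON source.id = jrnl.id
WHERE source.name != jrnl.name
OR source.city != jrnl.city
-- serialize the array<struct<...>> columns to JSON strings so they can be compared as primitives
OR to_json(source.properties_owned) != to_json(jrnl.properties_owned)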

Answer 1 (score: 1)

I fixed it using LATERAL VIEW explode().
Then I used concat_ws() and collect_list(array<string>) on the exploded columns, which finally gave me a string that I could compare:

SELECT  --fetch modified/updated records in source
   source.id AS id,
   source.name AS name,
   source.city AS city,
   source.properties_owned AS properties_owned,
   current_timestamp() AS audit_insterted_ts,
   'U' AS audit_dml_action,
   'A' AS audit_active_flag
FROM source.customer_detail source
INNER JOIN target.customer_detail jrnl
ON source.id=jrnl.id
WHERE source.id IN
(
SELECT t1.id
FROM
(
   SELECT src.id,concat_ws(',', collect_list(src.property_addr),collect_list(src.location)) newcol
   FROM
   (
      SELECT id, prop_owned.property_addr AS property_addr, prop_owned.location AS location
      FROM source.customer_detail LATERAL VIEW explode(properties_owned) exploded_tab AS prop_owned
   ) src
   GROUP BY src.id
) t1
INNER JOIN
(
   SELECT trg.id,concat_ws(',', collect_list(trg.property_addr),collect_list(trg.location)) newcol
   FROM
   (
      SELECT id, prop_owned.property_addr AS property_addr, prop_owned.location AS location
      FROM target.customer_detail LATERAL VIEW explode(properties_owned) exploded_tab AS prop_owned
   ) trg
   GROUP BY trg.id
) t2
ON t1.id=t2.id
WHERE t1.newcol!=t2.newcol
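
Note that collect_list() does not guarantee element order, so the same set of properties could in principle produce different concatenated strings on the two sides. A possible refinement is to sort the collected values before concatenation; a minimal sketch, reusing the same exploded subquery (untested):

SELECT src.id,
       -- sort_array makes the concatenated string order-insensitive
       concat_ws(',',
                 sort_array(collect_list(src.property_addr)),
                 sort_array(collect_list(src.location))) AS newcol
FROM
(
   SELECT id, prop_owned.property_addr AS property_addr, prop_owned.location AS location
   FROM source.customer_detail LATERAL VIEW explode(properties_owned) exploded_tab AS prop_owned
) src
GROUP BY src.id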

Hope someone finds this helpful. :-)

Answer 2 (score: 0)

Problem: you are trying to compare lists instead of primitive types.

Current situation: lists of complex objects cannot be compared directly with the built-in Hive UDFs (there are some workarounds for lists of strings).

Workaround: you will need some third-party UDFs to help you. There are some interesting UDFs here (I have not tested them before).
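
For reference, a minimal sketch of the kind of built-in workaround that exists for lists of strings (the names source_tbl, target_tbl and tags are hypothetical; this only works for array<string> columns, not for arrays of structs like properties_owned):

-- order-insensitive comparison of two array<string> columns using only built-in UDFs
SELECT s.id
FROM source_tbl s
JOIN target_tbl t ON s.id = t.id
WHERE concat_ws(',', sort_array(s.tags)) != concat_ws(',', sort_array(t.tags));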