Error when running an insert overwrite query in Hive

Date: 2015-01-26 09:22:40

Tags: hadoop hive hbase

I am using Hadoop 1.2, HBase 0.94.8, and Hive 0.14, and I am trying to insert data into an HBase table from Hive. I have created the table:

CREATE TABLE hbase_table_emp(id int, name string, role string) 
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:name,cf1:role")
TBLPROPERTIES ("hbase.table.name" = "emp");
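
To confirm that the HBaseStorageHandler actually created the backing emp table, you can check from the HBase shell; the commands below are standard HBase shell usage, shown here only as a sanity check:

hbase shell
hbase(main):001:0> list
hbase(main):002:0> describe 'emp'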

and loaded the data into another table, which I will then overwrite into the HBase table:

hive> create table testemp(id int, name string, role string) row format delimited fields terminated by '\t';
hive> load data local inpath '/home/user/sample.txt' into table testemp;
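
For reference, sample.txt is expected to match the tab-delimited (id, name, role) layout of testemp; the rows below are purely made-up illustrative values:

1	John	Manager
2	Alice	Developer
3	Bob	Tester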

Now I am trying to overwrite it into the HBase table.

When I run:

hive> insert overwrite table hbase_table_emp select * from testemp;

I get this error:

hive> insert overwrite table hbase_table_emp select * from testemp;
Query ID = hduser_20150126005151_ebc2a36f-97c4-41da-b145-32d5732d9681
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
java.lang.NoClassDefFoundError: org/cliffc/high_scale_lib/Counter
    at org.apache.hadoop.hive.hbase.HBaseStorageHandler.configureJobConf(HBaseStorageHandler.java:470)
    at org.apache.hadoop.hive.ql.plan.PlanUtils.configureJobConf(PlanUtils.java:856)
    at org.apache.hadoop.hive.ql.plan.MapWork.configureJobConf(MapWork.java:544)
    at org.apache.hadoop.hive.ql.plan.MapredWork.configureJobConf(MapredWork.java:68)
    at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:370)
    at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1604)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1364)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1177)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:994)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:247)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:199)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:410)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:783)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:622)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
Caused by: java.lang.ClassNotFoundException: org.cliffc.high_scale_lib.Counter
    at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
    ... 24 more
FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. org/cliffc/high_scale_lib/Counter

Can anyone help me?

2 Answers:

Answer 0 (score: 2)

I found a solution to this problem! First I switched to Hadoop 2.4 / HBase 0.98 / Hive 0.14. Then, in hive-env.sh, I set:

export HIVE_AUX_JARS_PATH=/usr/local/hbase/lib

and in the Hive shell:

hive> add jar /usr/local/hive/lib/hive-hbase-handler-0.14.0.jar;  
hive> add jar /usr/local/hbase/lib/hbase-common-0.98.0-hadoop2.jar;
hive> add jar /usr/local/hbase/lib/zookeeper-3.4.5.jar; 
hive> add jar /usr/local/hbase/lib/guava-12.0.1.jar;
hive> add jar /usr/local/hbase/lib/high-scale-lib-1.1.1.jar; 
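
Before adding the jars, it can also help to confirm that the jar shipping the missing org.cliffc.high_scale_lib.Counter class is actually present on disk; the paths below assume the same /usr/local layout as above:

ls /usr/local/hbase/lib | grep high-scale
# optionally confirm the class is packaged inside the jar
unzip -l /usr/local/hbase/lib/high-scale-lib-1.1.1.jar | grep Counter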

This worked for me :)

Answer 1 (score: 0)

It sounds like Hive cannot resolve the HBase home directory (and its associated libraries) at runtime. Can you verify that the following environment variables are set? (Your actual values may differ, of course, depending on where Hive and HBase are installed.)

HBASE_IDENT_STRING=hbase
HBASE_CONF_DIR=/etc/hbase/conf
HBASE_HOME=/usr/lib/hbase
HBASE_LOG_DIR=/var/log/hbase
HIVE_HOME=/usr/lib/hive
HBASE_PID_DIR=/var/run/hbase

I am not sure whether all of the above environment settings must be defined for what you are doing, but my gut feeling is that at least HBASE_HOME and HBASE_CONF_DIR need to be set.
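
As a quick check, assuming a bash environment, you can list which of these are already set in the shell you launch Hive from and export any missing ones before starting the CLI (the paths here are just the example values above):

env | grep -E '^(HBASE|HIVE)'
# export any that are missing, e.g.:
export HBASE_HOME=/usr/lib/hbase
export HBASE_CONF_DIR=/etc/hbase/conf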

For reference, the environment variables above are the ones the HDP 2.1 distribution sets for you at Hive runtime. I was able to run Hive-to-HBase persistence with a minimal dataset, so hopefully the fix lies in these environment settings.