Error when inserting into HBase from Hive

Time: 2015-02-18 20:25:24

Tags: hadoop hive hbase cloudera-cdh

I am using a CDH 4.7.1 cluster. The map phase appears to reach 100%, but the reduce phase never runs to completion. I have added the following property to hive-site.xml. The actual error messages are pasted at the end of this post. Thanks; any help is appreciated.

<property>
    <name>hive.aux.jars.path</name>
    <value> file:///opt/cloudera/parcels/CDH/lib/hbase/hbase.jar, 
        file:///opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hive/lib/hive-hbase-handler-0.10.0-cdh4.7.1.jar,
        file:///opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hive/lib/zookeeper.jar,
        file:///opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hive/lib/guava-11.0.2.jar
    </value>
</property>

ERROR:

java.lang.ClassNotFoundException: org.apache.hadoop.hive.hbase.HBaseSerDe
Continuing ...
java.lang.ClassNotFoundException: org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat
Continuing ...
java.lang.ClassNotFoundException: org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat
Continuing ...
java.lang.NullPointerException
Continuing ...
java.lang.NullPointerException
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.initializeOp(FileSinkOperator.java:315)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:360)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:436)
at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:392)
at org.apache.hadoop.hive.ql.exec.Operator.initializeOp(Operator.java:377)
at org.apache.hadoop.hive.ql.exec.LimitOperator.initializeOp(LimitOperator.java:41)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:360)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:436)
at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:392)
at org.apache.hadoop.hive.ql.exec.ExtractOperator.initializeOp(ExtractOperator.java:40)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:360)
at org.apache.hadoop.hive.ql.exec.ExecReducer.configure(ExecReducer.java:150)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:469)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:447)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
at org.apache.hadoop.mapred.Child.main(Child.java:262)


MORE ERROR LOG: FROM REDUCE TASK
2015-02-19 16:20:09,624 WARN org.apache.hadoop.mapred.Child: Error running child
java.lang.RuntimeException: Error in configuring object
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:469)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:447)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
    ... 9 more
Caused by: java.lang.RuntimeException: Reduce operator initialization failed
    at org.apache.hadoop.hive.ql.exec.ExecReducer.configure(ExecReducer.java:157)
    ... 14 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.initializeOp(FileSinkOperator.java:373)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:360)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:436)
    at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:392)
    at org.apache.hadoop.hive.ql.exec.Operator.initializeOp(Operator.java:377)
    at org.apache.hadoop.hive.ql.exec.LimitOperator.initializeOp(LimitOperator.java:41)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:360)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:436)
    at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:392)
    at org.apache.hadoop.hive.ql.exec.ExtractOperator.initializeOp(ExtractOperator.java:40)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:360)
    at org.apache.hadoop.hive.ql.exec.ExecReducer.configure(ExecReducer.java:150)
    ... 14 more
Caused by: java.lang.NullPointerException
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.initializeOp(FileSinkOperator.java:315)
    ... 25 more

1 Answer:

Answer 0 (score: 1)

Set the value of HIVE_AUX_JARS_PATH in hive-env.sh. Alternatively, if you have access to the hive shell, run the following:

hive> add jar /opt/cloudera/parcels/CDH/lib/hbase/hbase.jar; 
hive> add jar /opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hive/lib/hive-hbase-handler-0.10.0-cdh4.7.1.jar;
hive> add jar /opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hive/lib/zookeeper.jar;
hive> add jar /opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hive/lib/guava-11.0.2.jar;
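The hive-env.sh approach mentioned above might look like the fragment below. The jar paths are taken from the question; the exact format accepted for HIVE_AUX_JARS_PATH (comma-separated jars, or a single directory) can vary between Hive/CDH versions, so treat this as a sketch rather than a verified configuration:

```shell
# hive-env.sh (sketch) -- make the HBase handler jars visible to Hive at startup.
# HIVE_AUX_JARS_PATH is typically a comma-separated list of jar files
# (or a directory containing them); no whitespace between entries.
export HIVE_AUX_JARS_PATH=/opt/cloudera/parcels/CDH/lib/hbase/hbase.jar,/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hive/lib/hive-hbase-handler-0.10.0-cdh4.7.1.jar,/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hive/lib/zookeeper.jar,/opt/cloudera/parcels/CDH-4.7.1-1.cdh4.7.1.p0.47/lib/hive/lib/guava-11.0.2.jar
```

After restarting the Hive shell you can check that the jars were picked up with `list jars;` before re-running the insert.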

Hope this helps.