java.io.IOException: Merging of credentials not supported in this version of hadoop

Asked: 2015-02-16 16:27:02

Tags: hadoop hive hbase

I am trying to access, through Hive, a table that I created in HBase.

The following commands executed successfully:



hbase(main):032:0> create 'hbasetohive', 'colFamily'
0 row(s) in 1.9540 seconds

hbase(main):033:0> put 'hbasetohive', '1s', 'colFamily:val','1strowval'
0 row(s) in 0.1020 seconds

hbase(main):034:0> scan 'hbasetohive'
ROW                                   COLUMN+CELL                                                                                               
 1s                                   column=colFamily:val, timestamp=1423936170125, value=1strowval                                            
1 row(s) in 0.1170 seconds
-----
hive> CREATE EXTERNAL TABLE hbase_hivetable_k(key string, value string)
    > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    > WITH SERDEPROPERTIES ("hbase.columns.mapping" = "colFamily:val")
    > TBLPROPERTIES("hbase.table.name" = "hbasetohive");
OK
Time taken: 1.622 seconds
hive> Select * from hbase_hivetable_k;
OK
1s    1strowval
Time taken: 0.184 seconds, Fetched: 1 row(s)




But executing a COUNT(*) query fails. Unlike SELECT *, which Hive can answer with a simple fetch, COUNT compiles to a MapReduce job, and the error is thrown at job submission:



hive> select count(1) from hbase_hivetable_k;
Query ID = hduser_20150216081212_f47b2faa-be53-4eb3-b8dd-b56990455977
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
java.lang.RuntimeException: java.io.IOException: Merging of credentials not supported in this version of hadoop
	at org.apache.hadoop.hive.hbase.HBaseStorageHandler.configureJobConf(HBaseStorageHandler.java:485)
	at org.apache.hadoop.hive.ql.plan.PlanUtils.configureJobConf(PlanUtils.java:856)
	at org.apache.hadoop.hive.ql.plan.MapWork.configureJobConf(MapWork.java:540)
	at org.apache.hadoop.hive.ql.plan.MapredWork.configureJobConf(MapredWork.java:68)
	at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:370)
	at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1604)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1364)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1177)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:994)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:247)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:199)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:410)
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:783)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:601)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
Caused by: java.io.IOException: Merging of credentials not supported in this version of hadoop
	at org.apache.hadoop.hive.shims.Hadoop20SShims.mergeCredentials(Hadoop20SShims.java:527)
	at org.apache.hadoop.hive.hbase.HBaseStorageHandler.configureJobConf(HBaseStorageHandler.java:483)
	... 23 more
Job Submission failed with exception 'java.lang.RuntimeException(java.io.IOException: Merging of credentials not supported in this version of hadoop)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

I am using Hadoop version 1.2.1, Hive version 0.14.0, and HBase version 0.94.8.

Could you let me know which version I need to upgrade to get this working?

Regards, Koushik

1 Answer:

Answer 0 (score: 0)

You should upgrade your version of Hadoop; with Hadoop 2.4.0 everything works. The stack trace shows why: Hive has selected its Hadoop20SShims compatibility layer, which is used for Hadoop 1.x (the 0.20-security line), and that shim's mergeCredentials deliberately throws this IOException. When the HBase storage handler configures the MapReduce job, it needs to merge HBase credentials into the job configuration, and Hive only implements that operation in its Hadoop 2.x shims. Hive 0.14 with the HBase storage handler therefore needs a Hadoop 2.x cluster.
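
As a quick sanity check after upgrading (a minimal sketch; the install path below is an assumption for illustration), confirm which Hadoop the Hive CLI will pick up, then retry the query:

# Point Hive at the upgraded install (path assumed for illustration)
export HADOOP_HOME=/usr/local/hadoop-2.4.0
export PATH=$HADOOP_HOME/bin:$PATH

# The first line of output should now report Hadoop 2.4.0
hadoop version

# Retry the aggregate query that previously failed
hive -e 'select count(1) from hbase_hivetable_k;'

If hadoop version still reports 1.2.1, Hive is resolving the old install, and the exception will persist until the 2.x binaries are on the path of the node running the Hive CLI.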