FAILED: Error in metadata: MetaException(message: org.apache.hadoop.hbase.MasterNotRunningException) in HBase + Hive integration

Date: 2013-01-08 16:31:02

Tags: hbase, hive

Components used: Ubuntu 12.04, Hive 0.9.0, HBase 0.94.3, Hadoop 0.20.2 (single node), ZooKeeper 3.4.3, guava-11.0.2, hive-hbase-handler-0.9.0

Tutorial: https://cwiki.apache.org/confluence/display/Hive/HBaseIntegration

Steps: 1) start Hadoop, 2) start the HBase master, 3) start hiveserver, 4) start the Hive shell with the --auxpath argument as shown in the tutorial (sketched below). Everything came up without errors; jps lists JobTracker, HRegionServer, HMaster, DataNode, NameNode, SecondaryNameNode, TaskTracker, HQuorumPeer, and Jps.

hbase(main):001:0> status
1 servers, 0 dead, 2.0000 average load
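
For reference, a minimal sketch of that startup sequence; the install-path variables below are assumptions, not taken from the post:

$HADOOP_HOME/bin/start-all.sh                # 1) Hadoop daemons
$HBASE_HOME/bin/start-hbase.sh               # 2) HBase master (plus the bundled ZooKeeper)
$HIVE_HOME/bin/hive --service hiveserver &   # 3) Hive server
$HIVE_HOME/bin/hive --auxpath <handler and dependency jars>   # 4) Hive shell, as in the tutorial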

In the Hive shell I can create a plain table and load data into it:

CREATE TABLE IF NOT EXISTS familia (id_familia INT, fk_veiculo INT, fk_cliente INT, nome STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS TEXTFILE;
LOAD DATA LOCAL INPATH '/home/trendwise/hive_data/tables_csv/familia.csv' OVERWRITE INTO TABLE familia;

But when I run the statement below, nothing is shown; the cursor just blinks. After waiting a long time, it finally shows an error:

CREATE TABLE hbase_familia_1 (key int, id_familia int, fk_veiculo INT,fk_cliente INT,nome STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES("hbase.columns.mapping" = ":key, cf1:id_familia, cf2:fk_veiculo,cf3:fk_cliente, cf4:nome")
TBLPROPERTIES ("hbase.table.name" = "hbase_familia");

When I run list in the hbase shell, none of the tables I tried to create show up.

Error:

FAILED: Error in metadata: MetaException(message:org.apache.hadoop.hbase.MasterNotRunningException: Retried 10 times
at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:127)
at org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:73)
at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:147)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:398)
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:538)
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3305)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:242)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:134)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1326)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1118)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:557)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
) FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

hbase-site.xml:

<property>
 <name>hbase.rootdir</name>
 <value>hdfs://localhost:54310/hbase</value>
</property>
<property>
 <name>dfs.replication</name>
 <value>1</value>
</property>
<property>
 <name>hbase.zookeeper.property.dataDir</name>
 <value>/home/trendwise/hadoop/hbase-0.94.3/hbase_dataDir</value>
</property>
<property>
 <name>hbase.zookeeper.property.clientPort</name>
 <value>2222</value>
 <description>Property from ZooKeeper's config zoo.cfg.
 </description>
</property>
<property>
 <name>hbase.zookeeper.quorum</name>
 <value>localhost</value>
 <description></description>
</property>
<property>
 <name>hbase.cluster.distributed</name>
 <value>true</value>
 <description></description>
</property>

3 Answers:

Answer 0 (score: 0)

Change the line containing "127.0.1.1" in the /etc/hosts file to "127.0.0.1" and restart everything. Also, copy hbase-site.xml into the HIVE_HOME/conf directory.
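
A sketch of both steps, assuming HBASE_HOME and HIVE_HOME point at the installs listed in the question:

sudo sed -i 's/^127\.0\.1\.1/127.0.0.1/' /etc/hosts    # map the host entry to 127.0.0.1
cp $HBASE_HOME/conf/hbase-site.xml $HIVE_HOME/conf/    # let Hive see the HBase/ZooKeeper settings
# then restart Hadoop, HBase, and the Hive shell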

Answer 1 (score: 0)

HBase is not managing to start all of its daemons, so you may find a clue in the HBase logs. Check whether the logs contain any errors and report what you find. Hope this helps.
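
For example (the log directory below is the HBase default and may differ on your install):

tail -n 200 $HBASE_HOME/logs/hbase-*-master-*.log                   # recent master activity
grep -iE 'error|exception' $HBASE_HOME/logs/hbase-*-master-*.log    # look for the first failure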

Answer 2 (score: 0)

It seems Hive cannot find the required jar files.

Try launching the shell with all the required jars on the aux path:

hive --auxpath /path-to-/hive-examples.jar <other required jar files in the same list>
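
For comparison, the tutorial linked in the question passes the HBase handler and its dependencies as a comma-separated --auxpath list; a sketch using the versions from the question (the jar locations are assumptions):

hive --auxpath $HIVE_HOME/lib/hive-hbase-handler-0.9.0.jar,$HBASE_HOME/hbase-0.94.3.jar,$HBASE_HOME/lib/zookeeper-3.4.3.jar,$HBASE_HOME/lib/guava-11.0.2.jar \
     --hiveconf hbase.zookeeper.quorum=localhost
# with the non-default client port in hbase-site.xml you may also need
#     --hiveconf hbase.zookeeper.property.clientPort=2222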

Then try placing the HBase configuration file in the Hive conf directory.

Also define the aux path in hive-site.xml or hive-env.sh and try again.
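
A minimal hive-env.sh sketch along those lines (jar locations are assumptions; the hive-site.xml equivalent is the hive.aux.jars.path property):

# hive-env.sh: make the handler jars available to every Hive session
export HIVE_AUX_JARS_PATH=$HIVE_HOME/lib/hive-hbase-handler-0.9.0.jar,$HBASE_HOME/hbase-0.94.3.jar,$HBASE_HOME/lib/zookeeper-3.4.3.jar,$HBASE_HOME/lib/guava-11.0.2.jar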

These errors are mostly caused by Hive being unable to connect to Hadoop or HBase.

Also check that all HBase daemons are running and that you can perform all operations from the hbase shell.