HCatalog Hive issue

Date: 2014-10-29 10:24:47

Tags: hive hcatalog

I am trying to run the HCatalog example from the following link:

http://www.cloudera.com/content/cloudera/en/documentation/cdh4/v4-2-0/CDH4-Installation-Guide/cdh4ig_topic_19_6.html

I get the following exception when running the job:

Exception in thread "main" com.google.common.util.concurrent.ExecutionError: java.lang.NoClassDefFoundError: org/antlr/runtime/RecognitionException
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2232)
    at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
    at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4764)
    at org.apache.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:167)
    at org.apache.hcatalog.common.HiveClientCache.get(HiveClientCache.java:143)
    at org.apache.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:544)
    at org.apache.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:103)
    at org.apache.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:85)
    at org.apache.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:85)
    at org.apache.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:54)
    at org.apache.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:46)
    at com.otsi.hcat.UseHCat.run(UseHCat.java:69)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at com.otsi.hcat.UseHCat.main(UseHCat.java:96)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.NoClassDefFoundError: org/antlr/runtime/RecognitionException
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getClass(MetaStoreUtils.java:1378)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:64)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:498)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:476)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:524)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:398)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:357)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4948)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:154)
    at org.apache.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:246)
    at org.apache.hcatalog.common.HiveClientCache$4.call(HiveClientCache.java:170)
    at org.apache.hcatalog.common.HiveClientCache$4.call(HiveClientCache.java:167)
    at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4767)
    at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
    at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
    at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
    ... 19 more
Caused by: java.lang.ClassNotFoundException: org.antlr.runtime.RecognitionException

Before running the MR job, I executed the following commands:

$ export HCAT_HOME=$HIVE_HOME/hcatalog
$ HCATJAR=$HCAT_HOME/share/hcatalog/hcatalog-core-0.11.0.jar
$ HCATPIGJAR=$HCAT_HOME/share/hcatalog/hive-hcatalog-pig-adapter-0.13.0.jar
$ export HADOOP_CLASSPATH=$HCATJAR:$HCATPIGJAR:$HIVE_HOME/lib/hive-exec-0.13.0.jar:$HIVE_HOME/lib/hive-metastore-0.13.0.jar:$HIVE_HOME/lib/jdo-api-3.0.1.jar:$HIVE_HOME/lib/libfb303-0.9.0.jar:$HIVE_HOME/lib/libthrift-0.9.0.jar:$HIVE_HOME/lib/slf4j-api-1.6.4.jar:$HIVE_HOME/conf:/usr/hadoop/hadoop-2.4.0/etc/hadoop/
$ LIBJARS=`echo $HADOOP_CLASSPATH | sed -e 's/:/,/g'`
$ export LIBJARS=$LIBJARS,$HIVE_HOME/lib/antlr-runtime-3.4.jar
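
One way to confirm that every entry on this classpath (including the antlr jar) actually exists on disk is a loop like the one below; this is just a minimal bash sketch, assuming the variables above are already exported:

# Flag any classpath entry that does not resolve to a real file or directory
for entry in $(echo $HADOOP_CLASSPATH:$HIVE_HOME/lib/antlr-runtime-3.4.jar | tr ':' '\n'); do
    [ -e "$entry" ] || echo "MISSING: $entry"
done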

3 Answers:

Answer 0 (score: 1):

I'm not running a CDH distribution, but I was able to get this working with the following configuration settings:

export HCAT_HOME=/usr/lib/hive-hcatalog
export HIVE_HOME=/usr/lib/hive
HCATJAR=$HCAT_HOME/share/hcatalog/hive-hcatalog-core-0.13.0.2.1.1.0-385.jar
HCATPIGJAR=$HCAT_HOME/share/hcatalog/hive-hcatalog-pig-adapter-0.13.0.2.1.1.0-385.jar
HIVE_VERSION=0.13.0.2.1.1.0-385
export HADOOP_CLASSPATH=$HCATJAR:$HCATPIGJAR:$HIVE_HOME/lib/hive-exec-$HIVE_VERSION.jar:$HIVE_HOME/lib/hive-metastore-$HIVE_VERSION.jar:$HIVE_HOME/lib/libfb303-0.9.0.jar:$HIVE_HOME/lib/libthrift-0.9.0.jar:$HIVE_HOME/conf:/etc/hadoop/conf
LIBJARS=`echo $HADOOP_CLASSPATH | sed -e 's/:/,/g'`
export LIBJARS=$LIBJARS,$HIVE_HOME/lib/antlr-runtime-3.4.jar

A few things to note:

  1. The comma on the last line between "$LIBJARS" and "$HIVE_HOME" is correct.
  2. I removed the references to $HIVE_HOME/lib/jdo2-api-2.3-ec.jar and $HIVE_HOME/lib/slf4j-api-1.6.4.jar because my Hadoop distribution does not have them; the code works fine without them.
  3. Hadoop moves very fast, so jar versions change. For each jar file referenced in these settings, run ls -l to make sure the jar actually exists where you think it should.
  4. This code uses some deprecated API calls. My advice is (at least for now) not to change the code; I found that switching to the non-deprecated versions broke it (see also Radek's update to the same effect).
  5. I hope this helps!
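
With those variables in place, job submission typically looks something like the sketch below. The jar name and trailing arguments are illustrative; the com.otsi.hcat.UseHCat class and its ToolRunner usage come from the stack trace in the question, which is what makes the generic -libjars option apply:

# HADOOP_CLASSPATH is picked up by the `hadoop` launcher for the client-side JVM;
# -libjars ships the comma-separated LIBJARS list to the cluster with the job.
hadoop jar UseHCat.jar com.otsi.hcat.UseHCat -libjars $LIBJARS <input-table> <output-path>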

Answer 1 (score: 0):

Make sure you have the following three DataNucleus jars on your classpath:

datanucleus-rdbms-3.x.x.jar
datanucleus-core-3.x.x.jar
datanucleus-api-jdo-3.x.x.jar

It is always good to have '$HIVE_HOME/conf' on HADOOP_CLASSPATH and CLASSPATH as well, since it carries important information about how to connect to the metastore.
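
A minimal sketch of adding all of this, assuming the DataNucleus jars ship under $HIVE_HOME/lib (version numbers vary by distribution, so the glob avoids hard-coding them):

# Append all datanucleus-* jars plus the Hive conf dir to HADOOP_CLASSPATH
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$(echo $HIVE_HOME/lib/datanucleus-*.jar | tr ' ' ':'):$HIVE_HOME/conf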

Answer 2 (score: 0):

You need to configure the environment variables in ~/.bashrc:
export SQOOP_HOME=/usr/lib/sqoop
export HBASE_HOME=/usr/local/Hbase
export HIVE_HOME=/usr/local/hive
export HCAT_HOME=/usr/local/hive/hcatalog
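
After editing ~/.bashrc, reload it in the current shell and confirm the variables took effect (a quick generic check):

source ~/.bashrc
echo $HIVE_HOME $HCAT_HOME    # should print the paths configured above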