I am getting the following error when I run a MapReduce job on Linux (CentOS). I have added all of the jars to the classpath. The database and table already exist in the Hive database and the table contains data in its columns, yet I still cannot read that data from the Hive table.
I am working with a vanilla Hadoop installation. Do I need to edit the hive-site.xml file with the MySQL driver path and the Hive username and password? If so, please tell me how to add the username and password for Hive. Thanks in advance.
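For reference, this is roughly how I understand the metastore connection section of hive-site.xml should look when Hive is backed by a MySQL metastore; the JDBC URL, user name, and password below are placeholders, not my actual values:

<!-- hive-site.xml: metastore connection settings (placeholder values) -->
<property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/metastore?createDatabaseIfNotExist=true</value>
</property>
<property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
</property>
<property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
</property>
<property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepassword</value>
</property>

The command I run and the full console output follow: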
murali]# hadoop jar /home/murali/workspace/hadoop/HiveInputForMapper/target/HiveInputForMapper-0.0.1-SNAPSHOT.jar com.cosmonet.HiveInputDriver -libjars $LIBJARS
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/murali/.m2/repository/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Java HotSpot(TM) Server VM warning: You have loaded library /hadoop/hadoop/lib/native/libhadoop.so which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
14/11/20 18:21:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/11/20 18:21:05 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
14/11/20 18:21:05 INFO metastore.ObjectStore: ObjectStore, initialize called
14/11/20 18:21:05 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
14/11/20 18:21:05 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
14/11/20 18:21:07 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
14/11/20 18:21:07 INFO metastore.MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5. Encountered: "@" (64), after : "".
14/11/20 18:21:08 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
14/11/20 18:21:08 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
14/11/20 18:21:09 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
14/11/20 18:21:09 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
14/11/20 18:21:09 INFO DataNucleus.Query: Reading in results for query "org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used is closing
14/11/20 18:21:09 INFO metastore.ObjectStore: Initialized ObjectStore
14/11/20 18:21:09 INFO metastore.HiveMetaStore: Added admin role in metastore
14/11/20 18:21:09 INFO metastore.HiveMetaStore: Added public role in metastore
14/11/20 18:21:09 INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
14/11/20 18:21:09 INFO metastore.HiveMetaStore: 0: get_databases: NonExistentDatabaseUsedForHealthCheck
14/11/20 18:21:09 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_databases: NonExistentDatabaseUsedForHealthCheck
14/11/20 18:21:09 INFO metastore.HiveMetaStore: 0: get_table : db=bigdata tbl=categories
14/11/20 18:21:09 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=bigdata tbl=categories
Exception in thread "main" java.io.IOException: NoSuchObjectException(message:bigdata.categories table not found)
    at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:97)
    at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:71)
    at com.cosmonet.HiveInputDriver.run(HiveInputDriver.java:27)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at com.cosmonet.HiveInputDriver.main(HiveInputDriver.java:49)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: NoSuchObjectException(message:bigdata.categories table not found)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1560)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
    at com.sun.proxy.$Proxy9.get_table(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
    at org.apache.hive.hcatalog.common.HCatUtil.getTable(HCatUtil.java:191)
    at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:105)
    at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
    at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
    ... 10 more
14/11/20 18:23:10 INFO metastore.HiveMetaStore: 1: Shutting down the object store...
14/11/20 18:23:10 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=Shutting down the object store...
14/11/20 18:23:10 INFO metastore.HiveMetaStore: 1: Metastore shutdown complete.
14/11/20 18:23:10 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=Metastore shutdown complete
Answer 0 (score: 1)
It seems that your Hadoop installation contains multiple SLF4J bindings; removing one of them may resolve the problem. Add the following exclusion to the dependency that causes the conflict:
<exclusions>
    <exclusion>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
</exclusions>
Answer 1 (score: 1)
My guess is that the framework cannot find the correct Hive metastore. Try supplying the Hive conf directory like this: hadoop --config $HIVE_HOME/conf
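A minimal sketch of how that could look with your original command, assuming $HIVE_HOME points at the Hive installation whose hive-site.xml holds the MySQL metastore settings; putting the conf directory on HADOOP_CLASSPATH is an alternative way (my suggestion, not part of the original answer) to get hive-site.xml onto the job's classpath:

# Run the job with the Hive conf directory supplied to the hadoop script
# (assumes $HIVE_HOME/conf contains the hive-site.xml for the MySQL metastore).
hadoop --config $HIVE_HOME/conf jar \
    /home/murali/workspace/hadoop/HiveInputForMapper/target/HiveInputForMapper-0.0.1-SNAPSHOT.jar \
    com.cosmonet.HiveInputDriver -libjars $LIBJARS

# Alternative: keep the normal Hadoop conf dir and put the Hive conf on the classpath instead.
export HADOOP_CLASSPATH=$HIVE_HOME/conf:$HADOOP_CLASSPATH
hadoop jar /home/murali/workspace/hadoop/HiveInputForMapper/target/HiveInputForMapper-0.0.1-SNAPSHOT.jar \
    com.cosmonet.HiveInputDriver -libjars $LIBJARS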