Pig cannot find a Hive table using HCatalog

Date: 2014-12-31 19:55:21

Tags: hive apache-pig hcatalog

I am using Pig to access a table named batting_data that was created through HCatalog. When I do so, I get an error saying the table cannot be found. However, the batting_data table is accessible from Hive, and I also understand that when no database name is given, the default database is assumed.

ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1115: Table not found : default.batting_data table not found
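
For reference, something like the following from the Hive CLI is what confirms the table really is visible under the default database:

    hive> USE default;
    hive> SHOW TABLES LIKE 'batting_data';
    hive> DESCRIBE batting_data;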

  1. I have set up hive-site.xml as shown below. Note that I am not using a remote metastore server; the metastore uses a local MySQL server.

    <configuration>
    <property>
            <name>javax.jdo.option.ConnectionURL</name>
            <value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
            <description>the URL of the MySQL database</description>
    </property>
    
    <property>
            <name>javax.jdo.option.ConnectionDriverName</name>
            <value>com.mysql.jdbc.Driver</value>
    </property>
    
    <property>
            <name>javax.jdo.option.ConnectionUserName</name>
            <value>root</value>
    </property>
    
    <property>
            <name>javax.jdo.option.ConnectionPassword</name>
            <value>root</value>
    </property>
    
    <property>
            <name>hive.hwi.listen.host</name>
            <value>0.0.0.0</value>
    </property>
    <property>
            <name>hive.hwi.listen.port</name>
            <value>9999</value>
    </property>
    <property>
            <name>hive.hwi.war.file</name>
            <value>lib/hive-hwi-0.12.0.war</value>
    </property>
    
    <property>
            <name>hive.metastore.local</name>
            <value>true</value>
    </property>
    </configuration>
    

  2. I have set the following in .bashrc for integrating Pig with Hive and HCatalog.

    export PIG_OPTS=-Dhive.metastore.local=true
    export PIG_CLASSPATH=$HCAT_HOME/share/hcatalog/*:$HIVE_HOME/lib/*

  3. When Pig starts, the Grunt shell loads the following statements by default.

    REGISTER /home/shiva/hive-0.12.0/hcatalog/share/hcatalog/hcatalog-core-0.12.0.jar;
    REGISTER /home/shiva/hive-0.12.0/lib/hive-exec-0.12.0.jar;
    REGISTER /home/shiva/hive-0.12.0/lib/hive-metastore-0.12.0.jar;


  4. The full log of the error message is shown below. Any help resolving this would be greatly appreciated. Thanks.

    grunt> a = LOAD 'batting_data' USING org.apache.hcatalog.pig.HCatLoader();         
    2015-01-01 01:06:33,849 [main] INFO  org.apache.hadoop.hive.metastore.HiveMetaStore - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
    2015-01-01 01:06:33,865 [main] INFO  org.apache.hadoop.hive.metastore.ObjectStore - ObjectStore, initialize called
    2015-01-01 01:06:34,049 [main] INFO  DataNucleus.Persistence - Property datanucleus.cache.level2 unknown - will be ignored
    2015-01-01 01:06:34,365 [main] WARN  com.jolbox.bonecp.BoneCPConfig - Max Connections < 1. Setting to 20
    2015-01-01 01:06:35,470 [main] INFO  org.apache.hadoop.hive.metastore.ObjectStore - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
    2015-01-01 01:06:35,501 [main] INFO  org.apache.hadoop.hive.metastore.ObjectStore - Initialized ObjectStore
    2015-01-01 01:06:36,265 [main] WARN  com.jolbox.bonecp.BoneCPConfig - Max Connections < 1. Setting to 20
    2015-01-01 01:06:36,506 [main] INFO  org.apache.hadoop.hive.metastore.HiveMetaStore - 0: get_database: NonExistentDatabaseUsedForHealthCheck
    2015-01-01 01:06:36,506 [main] INFO  org.apache.hadoop.hive.metastore.HiveMetaStore.audit - ugi=shiva   ip=unknown-ip-addr  cmd=get_database: NonExistentDatabaseUsedForHealthCheck 
    2015-01-01 01:06:36,512 [main] ERROR org.apache.hadoop.hive.metastore.RetryingHMSHandler - NoSuchObjectException(message:There is no database named nonexistentdatabaseusedforhealthcheck)
        at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:431)
        at org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:441)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:124)
        at com.sun.proxy.$Proxy6.getDatabase(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_database(HiveMetaStore.java:628)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:103)
        at com.sun.proxy.$Proxy7.get_database(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:810)
        at org.apache.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.isOpen(HiveClientCache.java:277)
        at org.apache.hcatalog.common.HiveClientCache.get(HiveClientCache.java:147)
        at org.apache.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:547)
        at org.apache.hcatalog.pig.PigHCatUtil.getHiveMetaClient(PigHCatUtil.java:150)
        at org.apache.hcatalog.pig.PigHCatUtil.getTable(PigHCatUtil.java:186)
        at org.apache.hcatalog.pig.HCatLoader.getSchema(HCatLoader.java:194)
        at org.apache.pig.newplan.logical.relational.LOLoad.getSchemaFromMetaData(LOLoad.java:175)
        at org.apache.pig.newplan.logical.relational.LOLoad.<init>(LOLoad.java:89)
        at org.apache.pig.parser.LogicalPlanBuilder.buildLoadOp(LogicalPlanBuilder.java:853)
        at org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3479)
        at org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1536)
        at org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:1013)
        at org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:553)
        at org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:421)
        at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:188)
        at org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1648)
        at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1621)
        at org.apache.pig.PigServer.registerQuery(PigServer.java:575)
        at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1093)
        at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:501)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:198)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:173)
        at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
        at org.apache.pig.Main.run(Main.java:541)
        at org.apache.pig.Main.main(Main.java:156)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
    
    2015-01-01 01:06:36,514 [main] INFO  org.apache.hadoop.hive.metastore.HiveMetaStore - 0: get_table : db=default tbl=batting_data
    2015-01-01 01:06:36,514 [main] INFO  org.apache.hadoop.hive.metastore.HiveMetaStore.audit - ugi=shiva   ip=unknown-ip-addr  cmd=get_table : db=default tbl=batting_data 
    2015-01-01 01:06:36,516 [main] INFO  DataNucleus.Datastore - The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
    2015-01-01 01:06:36,516 [main] INFO  DataNucleus.Datastore - The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
    2015-01-01 01:06:36,795 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1115: Table not found : default.batting_data table not found
    Details at logfile: /home/shiva/pig_1420054544179.log
    

1 Answer:

Answer 0 (Score: 2)

OK, here is how I fixed it:

  1. I had not set PIG_OPTS with the correct Hive Thrift server address, which is why Pig could not connect to the Hive metastore and could not find the table. I changed it to PIG_OPTS=-Dhive.metastore.uris=thrift://localhost:10000 (see the sketch after this list).

  2. Started the HiveServer service using:

    $ bin/hive --service hiveserver

  3. The above resolved the problem, and Pig can now connect to Hive. Thanks.
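
For completeness, here is a rough sketch of the relevant pieces after the fix (the Thrift host/port and jar paths are from my setup, so adjust them to yours):

    # .bashrc -- point Pig at the metastore's Thrift endpoint instead of hive.metastore.local
    export PIG_OPTS=-Dhive.metastore.uris=thrift://localhost:10000
    export PIG_CLASSPATH=$HCAT_HOME/share/hcatalog/*:$HIVE_HOME/lib/*

    # start the Hive server, then load the table again from the Grunt shell
    $ bin/hive --service hiveserver

    grunt> a = LOAD 'batting_data' USING org.apache.hcatalog.pig.HCatLoader();
    grunt> DESCRIBE a;

With the metastore URI set, HCatLoader can reach the metastore and resolve default.batting_data.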