Loading CSV data into HBase using Pig or Hive

Date: 2014-04-29 13:35:36

Tags: hadoop hive hbase apache-pig hbasestorage

Hi, I have created a Pig script that loads data into HBase. My CSV file is stored at the Hadoop location /hbase_tables/zip.csv.

Pig Script


register /home/hduser/pig-0.12.0/lib/pig-0.8.0-core.jar;
A = LOAD '/hbase_tables/zip.csv' USING PigStorage(',') as (id:chararray, zip:chararray, desc1:chararray, desc2:chararray, income:chararray);
STORE A INTO 'hbase://mydata' USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('zip:zip,desc:desc1,desc:desc2,income:income');

When I execute it, I get the following error:


Pig Stack Trace

ERROR 2017: Internal error creating job configuration.

org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobCreationException: ERROR 2017: Internal error creating job configuration.
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:667)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:256)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:147)
        at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.execute(HExecutionEngine.java:378)
        at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1198)
        at org.apache.pig.PigServer.execute(PigServer.java:1190)
        at org.apache.pig.PigServer.access$100(PigServer.java:128)
        at org.apache.pig.PigServer$Graph.execute(PigServer.java:1517)
        at org.apache.pig.PigServer.executeBatchEx(PigServer.java:362)
        at org.apache.pig.PigServer.executeBatch(PigServer.java:329)
        at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:112)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:169)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:141)
        at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:90)
        at org.apache.pig.Main.run(Main.java:510)
        at org.apache.pig.Main.main(Main.java:107)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: hbase://mydata_logs
        at org.apache.hadoop.fs.Path.initialize(Path.java:148)
        at org.apache.hadoop.fs.Path.<init>(Path.java:71)
        at org.apache.hadoop.fs.Path.<init>(Path.java:45)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:470)
        ... 20 more
Caused by: java.net.URISyntaxException: Relative path in absolute URI: hbase://mydata_logs
        at java.net.URI.checkPath(URI.java:1804)
        at java.net.URI.<init>(URI.java:752)
        at org.apache.hadoop.fs.Path.initialize(Path.java:145)
        ... 23 more

Please tell me how to import the CSV data file into HBase, or suggest an alternative solution if you have one.

4 answers:

Answer 0 (score: 0)

It looks like your problem is the "Relative path in absolute URI: hbase://mydata_logs" part. Are you sure the path is correct?

Answer 1 (score: 0)

Perhaps the table does not exist. Start the hbase shell and type list. Is your table in the list?
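
For reference, a quick check from the hbase shell might look like the sketch below (the column families zip, desc and income are taken from the question's HBaseStorage mapping; the create command is only needed if the table does not exist yet):

hbase shell
list
create 'mydata', 'zip', 'desc', 'income'
describe 'mydata'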

Answer 2 (score: 0)

Try registering the required libraries in your Pig script.
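
For example, a minimal set of register statements might look like the sketch below (the jar paths and version numbers are assumptions; use the jars that ship with your own HBase and ZooKeeper installation):

-- the exact jar locations and versions below are assumptions; check your own installation
register /usr/lib/zookeeper/zookeeper-3.4.5.jar;
register /usr/lib/hbase/hbase-0.94.20.jar;
register /usr/lib/hbase/lib/guava-11.0.2.jar;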

Refer to the following link, which may help you.

Answer 3 (score: 0)

I once had the same task and have a complete, working solution (although, honestly, I'm not sure about the commas in the third line of your code):

-- resolve $HBASE_HOME and the HDFS input path
%default hbase_home `echo \$HBASE_HOME`;
%default tmp '/user/alexander/tmp/users_dump/k14';

-- point Pig at the ZooKeeper quorum used by HBase
set zookeeper.znode.parent '/hbase-unsecure';
set hbase.zookeeper.quorum 'dmp-hbase.local';

-- register the ZooKeeper and HBase jars needed by HBaseStorage
register $hbase_home/lib/zookeeper-3.4.5.jar;
register $hbase_home/hbase-0.94.20.jar;

-- load the tab-separated dump, reading its schema from the stored .pig_schema file
UsersHdfs = LOAD '$tmp' USING PigStorage('\t', '-schema');

-- the first field of the relation becomes the HBase row key
STORE UsersHdfs INTO 'hbase://user_test' USING
    org.apache.pig.backend.hadoop.hbase.HBaseStorage(
        'id:DEFAULT id:last_modified birth:year gender:female gender:male',
        '-caster HBaseBinaryConverter');

That code works for me, so perhaps the problem is in your HBase configuration. If you share your .csv file, we can discuss it in more detail.
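
As a follow-up, the same pattern adapted to the question's zip.csv might look roughly like the sketch below (a sketch only: it assumes the table mydata already exists with column families zip, desc and income, that the ZooKeeper settings match your cluster, and it uses space-separated column descriptors instead of commas, as in the script above):

%default hbase_home `echo \$HBASE_HOME`;

-- adjust these two settings to your own cluster
set zookeeper.znode.parent '/hbase';
set hbase.zookeeper.quorum 'localhost';

register $hbase_home/lib/zookeeper-3.4.5.jar;
register $hbase_home/hbase-0.94.20.jar;

A = LOAD '/hbase_tables/zip.csv' USING PigStorage(',')
    AS (id:chararray, zip:chararray, desc1:chararray, desc2:chararray, income:chararray);

-- the first field (id) becomes the HBase row key
STORE A INTO 'hbase://mydata' USING
    org.apache.pig.backend.hadoop.hbase.HBaseStorage(
        'zip:zip desc:desc1 desc:desc2 income:income');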