When I try to copy data from HDFS into HBase using Pig and the input file has more than two columns (fields), I get an error. Detailed example:
File in HDFS: /home/1.txt
1 2 3 4
5 6 7 8
HBase table with two column families: create 'table1', 'P', 'S'
Pig commands:
A = LOAD '/home/1.txt' USING PigStorage('\t') AS (one:chararray, two:chararray, three:chararray, four:chararray);
STORE A INTO 'hbase://table1' USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('P:one,P:two,S:three,S:four');
Error log:
*********
2013-09-20 15:42:25,314 [main] ERROR org.apache.pig.tools.pigstats.PigStats - ERROR: Index: 1, Size: 1
2013-09-20 15:42:25,315 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
2013-09-20 15:42:25,315 [main] INFO org.apache.pig.tools.pigstats.PigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
0.20.2-cdh3u6 0.8.1-cdh3u6 hdfs 2013-09-20 15:41:45 2013-09-20 15:42:25 UNKNOWN
Failed!
Failed Jobs:
JobId Alias Feature Message Outputs
job_201309051922_0192 A MAP_ONLY Message: Job failed! Error - NA hbase://hh2,
Input(s):
Failed to read data from "/home/1.txt"
Output(s):
Failed to produce result in "hbase://hh2"
Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG:
job_201309051922_0192
2013-09-20 15:42:25,315 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2013-09-20 15:42:25,352 [main] ERROR org.apache.pig.tools.grunt.GruntParser - ERROR 2999: Unexpected internal error. Index: 1, Size: 1
*********
But when I try to copy only two fields, it works fine. Code below:
A = LOAD '/home/1.txt' USING PigStorage('\t') AS (one:chararray, two:chararray);
STORE A INTO 'hbase://table1' USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('P:one,P:two,S:three,S:four');
[s@namenode ~]$ hadoop version
Hadoop 0.20.2-cdh3u6
HBase version: 0.90.6-cdh3u6
Pig version: Apache Pig version 0.8.1-cdh3u6 (rexported)
Answer 0 (score: 0):
A = LOAD '/home/1.txt' USING PigStorage(',') AS (id:int, name:chararray, city:chararray);
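A likely cause of the "Index: 1, Size: 1" failure, based on how Pig's HBaseStorage maps fields: the first field of the relation is used as the HBase row key, and the column descriptor list applies only to the remaining fields. With four loaded fields, the spec can therefore name at most three columns, not four. A sketch of the adjusted script under that assumption (field names are illustrative, not from the original post):

```
-- Assume the first column of 1.txt serves as the HBase row key.
A = LOAD '/home/1.txt' USING PigStorage('\t')
    AS (rowkey:chararray, two:chararray, three:chararray, four:chararray);

-- HBaseStorage consumes the first field as the row key, so list
-- column mappings only for the remaining three fields.
STORE A INTO 'hbase://table1'
    USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('P:two,S:three,S:four');
```

This would also explain why the two-field variant succeeded: after the row key was taken, only one field remained to map, so no out-of-range column index was hit.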