Pig on Amazon EMR - Storing output to a file

Time: 2016-03-20 21:14:26

Tags: hadoop amazon-web-services amazon-s3 apache-pig store

I am currently using Pig on Amazon Elastic MapReduce and am trying to perform a simple task: load some data from S3, transform it, and store the output to a file. However, I run into problems when I use the STORE command.

Here is my code:

cp s3://stackexchangedata/Data/Query_1-50000.csv file:///home/hadoop
REGISTER 'file:///home/hadoop/piggybank.jar'
RAW_LOGS1 = LOAD 'file:///home/hadoop/Query_1-50000.csv' USING org.apache.pig.piggybank.storage.CSVExcelStorage(',', 'YES_MULTILINE') as (Id:Long, PostTypeID:chararray, AcceptedAnswerID:chararray, ParentID:chararray, CreationDate:chararray, DeletionDate:chararray,  Score:long, ViewCount:long, Body:chararray, OwnerUserID:chararray, OwnerDisplayName:chararray, LastEditorUserId:chararray, LastEditorDisplayName:chararray, LastEditDate:chararray, LastActivityDate:chararray, Title:chararray, Tags:chararray, AnswerCount:int, CommentCount:int, FavoriteCount:int, ClosedDate:chararray, CommunityOwnedDate:chararray);

RAW_LOGS1A = FOREACH RAW_LOGS1 GENERATE $0, $1, $2, $3, $4, $5, $6, $7, REPLACE(Body, '\n','') AS Body_Clean, $9, $10, $11, $12, $13, $14, $15, $16, $17, $18, $19, $20, $21;

STORE RAW_LOGS1A INTO 'file:///home/hadoop/test/'; 

The REGISTER, LOAD, and GENERATE commands seem to work, but the STORE command does not. There is a lot of output, so I have copied only the WARNING and ERROR parts.

16/03/20 21:03:22 WARN mapreduce.JobResourceUploader: No job jar file set.  User classes may not be found. See Job or Job#setJar(String).

org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Wrong FS: hdfs://ip-172-31-21-40.eu-west-1.compute.internal:8020/user/hadoop, expected: file:///
   at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:279)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:301)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:318)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:196)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
    at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
    at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:194)
    at java.lang.Thread.run(Thread.java:745)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:276)
Caused by: java.lang.IllegalArgumentException: Wrong FS: hdfs://ip-172-31-21-40.eu-west-1.compute.internal:8020/user/hadoop, expected: file:///
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:650)
    at org.apache.hadoop.fs.RawLocalFileSystem.setWorkingDirectory(RawLocalFileSystem.java:547)
    at org.apache.hadoop.fs.FilterFileSystem.setWorkingDirectory(FilterFileSystem.java:290)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:235)
    ... 18 more
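For reference, the "Wrong FS: hdfs://... expected: file:///" message usually indicates that the job is running in mapreduce mode, where the cluster's default filesystem is HDFS, so mixing local `file:///` paths into a cluster-executed job fails at split computation. A sketch of the two common workarounds, assuming the same jar and input file as above (the `s3://stackexchangedata/output/` path is an illustrative assumption, not a path from the question):

```pig
-- Option 1: run the whole script in local mode, so file:/// paths are
-- valid for LOAD and STORE alike:
--   pig -x local myscript.pig

-- Option 2: stay in mapreduce mode, but read and write storage that the
-- cluster nodes can all reach (S3 or HDFS) instead of one machine's
-- local filesystem:
REGISTER 'file:///home/hadoop/piggybank.jar';
RAW_LOGS1 = LOAD 's3://stackexchangedata/Data/Query_1-50000.csv'
    USING org.apache.pig.piggybank.storage.CSVExcelStorage(',', 'YES_MULTILINE');
STORE RAW_LOGS1 INTO 's3://stackexchangedata/output/';  -- hypothetical output bucket
```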

Can anyone help? I am very new to Hadoop, so any help is greatly appreciated.

Thanks in advance.

0 Answers:

There are no answers yet.