I'm trying to copy 800 Avro files (around 136 MB in size) from HDFS to S3 on an EMR cluster, but I'm hitting this exception:
18/06/26 10:53:14 INFO mapreduce.Job: map 100% reduce 91%
18/06/26 10:53:14 INFO mapreduce.Job: Task Id : attempt_1529995855123_0003_r_000006_0, Status : FAILED
Error: java.lang.RuntimeException: Reducer task failed to copy 1 files: hdfs://url-to-aws-emr/user/hadoop/output/part-00258-3a28110a-9270-4639-b389-3e1f7f386ed6-c000.avro etc
at com.amazon.elasticmapreduce.s3distcp.CopyFilesReducer.cleanup(CopyFilesReducer.java:67)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:179)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:635)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:390)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
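For context, the stack trace shows this copy was run with s3-dist-cp; a minimal sketch of such an invocation (the bucket name, paths, and pattern below are placeholders, not taken from the post):

s3-dist-cp --src hdfs:///user/hadoop/output --dest s3://my-bucket/output --srcPattern '.*\.avro'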
The EMR cluster configuration is:
core-site fs.trash.checkpoint.interval 60
core-site fs.trash.interval 60
hadoop-env.export HADOOP_CLIENT_OPTS -Xmx10g
hdfs-site dfs.replication 3
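For reference, settings like these are normally passed to EMR as a configurations JSON at cluster creation; a sketch using the values listed above (the surrounding structure is the standard EMR configurations format, not copied from the post):

[
  {
    "Classification": "core-site",
    "Properties": {
      "fs.trash.checkpoint.interval": "60",
      "fs.trash.interval": "60"
    }
  },
  {
    "Classification": "hadoop-env",
    "Configurations": [
      {
        "Classification": "export",
        "Properties": { "HADOOP_CLIENT_OPTS": "-Xmx10g" }
      }
    ],
    "Properties": {}
  },
  {
    "Classification": "hdfs-site",
    "Properties": { "dfs.replication": "3" }
  }
]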
Any help would be appreciated.
Edit:
Running the hdfs dfsadmin -report command gives the following output:
[hadoop@~]$ hdfs dfsadmin -report
Configured Capacity: 79056308744192 (71.90 TB)
Present Capacity: 78112126204492 (71.04 TB)
DFS Remaining: 74356972374604 (67.63 TB)
DFS Used: 3755153829888 (3.42 TB)
DFS Used%: 4.81%
Under replicated blocks: 126
Blocks with corrupt replicas: 0
Missing blocks: 63
Missing blocks (with replication factor 1): 0
This shows that blocks are missing. Does that mean I have to re-run the job? The report also shows 126 under-replicated blocks, which I take to mean 126 blocks are waiting to be re-replicated. How can I tell whether the missing blocks will be recovered as well?
Also, the under-replicated count has been sitting at 126 for the last 30 minutes. Is there a way to force the replication to go faster?
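(An aside that may help here, not from the original post: hdfs fsck reports exactly which files the missing blocks belong to, so you can check whether the files being copied are affected. Under-replicated blocks are re-replicated automatically by the NameNode over time, but a missing block, i.e. one with no surviving replica, cannot be recovered by HDFS itself; the file containing it has to be regenerated.)

hdfs fsck / -list-corruptfileblocks
hdfs fsck /user/hadoop/output -files -blocks -locations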
Answer 0 (score: 0):
I ran into the same "Reducer task failed to copy 1 files" error, and I found the logs of the MR job launched by s3-dist-cp under /var/log/hadoop-yarn/apps/hadoop/logs on HDFS:
hadoop fs -ls /var/log/hadoop-yarn/apps/hadoop/logs
I copied them to the local filesystem:
hadoop fs -get /var/log/hadoop-yarn/apps/hadoop/logs/application_nnnnnnnnnnnnn_nnnn/ip-nnn-nn-nn-nnn.ec2.internal_nnnn
and then examined them in a text editor to find more detailed diagnostics about the reducer phase. In my case, the error came back from the S3 service. You may find a different error.
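The same aggregated container logs can also be fetched in one step with the yarn CLI (the application ID below is a placeholder in the same style as above):

yarn logs -applicationId application_nnnnnnnnnnnnn_nnnn > s3distcp_app.log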