I am trying to put a large dataset (around 200 folders) into HDFS,
but I keep running into these errors:
WARN hdfs.DFSClient: Slow waitForAckedSeqno took 72699ms;
INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.111.3:50010;
java.io.IOException: Got error, status message , ack with firstBadLink as 192.168.111.3:50010
at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:140)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1363)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:449)
I run the put command once per folder rather than uploading everything in a single command: hdfs dfs -put "each folder" /hadoopPath
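For reference, this is roughly the loop I am running (a minimal sketch, assuming bash; /localData and /hadoopPath are placeholder paths):

for folder in /localData/*/ ; do
    # upload one folder at a time instead of the whole dataset in a single put
    hdfs dfs -put "$folder" /hadoopPath/
done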
Is there a solution for these errors?