I haven't had much luck figuring out what this error message means. I'm also new to HDFS and HBase, which is part of the problem. Aside from the HDFS server running out of disk space, what could cause this error?
2014-06-13 12:55:33,164 WARN org.apache.hadoop.hbase.regionserver.wal.HLogSplitter:
Could not open hdfs://<OURSERVER>:8020/hbase/.logs/<HBASE_BOX>,60020,1402678303659-splitting/<HBASE_BOX>m%2C60020%2C1402678303659.1402678319050 for reading. File is empty
java.io.EOFException
at java.io.DataInputStream.readFully(Unknown Source)
at java.io.DataInputStream.readFully(Unknown Source)
at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1800)
at org.apache.hadoop.io.SequenceFile$Reader.initialize(SequenceFile.java:1765)
at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1714)
at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1728)
at org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogReader$WALReader.<init>(SequenceFileLogReader.java:55)
at org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogReader.init(SequenceFileLogReader.java:178)
at org.apache.hadoop.hbase.regionserver.wal.HLog.getReader(HLog.java:745)
at org.apache.hadoop.hbase.regionserver.wal.HLogSplitter.getReader(HLogSplitter.java:855)
at org.apache.hadoop.hbase.regionserver.wal.HLogSplitter.getReader(HLogSplitter.java:768)
at org.apache.hadoop.hbase.regionserver.wal.HLogSplitter.splitLogFile(HLogSplitter.java:412)
at org.apache.hadoop.hbase.regionserver.wal.HLogSplitter.splitLogFile(HLogSplitter.java:380)
at org.apache.hadoop.hbase.regionserver.SplitLogWorker$1.exec(SplitLogWorker.java:115)
at org.apache.hadoop.hbase.regionserver.SplitLogWorker.grabTask(SplitLogWorker.java:283)
at org.apache.hadoop.hbase.regionserver.SplitLogWorker.taskLoop(SplitLogWorker.java:214)
at org.apache.hadoop.hbase.regionserver.SplitLogWorker.run(SplitLogWorker.java:182)
at java.lang.Thread.run(Unknown Source)
Answer 0 (score: 1)
You can check the state of HDFS (and repair errors) with fsck; see http://hadoop.apache.org/docs/r2.4.0/hadoop-project-dist/hadoop-common/CommandsManual.html#fsck
Once that is done, you can check the state of HBase with hbck; see http://hbase.apache.org/book/hbck.in.depth.html
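A minimal sketch of that two-step check, run against a live cluster as the appropriate hadoop/hbase user (the `/hbase` path matches the one in the error above; the extra `fsck` flags are optional detail, not something the answer prescribes):

```shell
# Report HDFS health under /hbase; -files/-blocks/-locations add per-file detail.
hdfs fsck /hbase -files -blocks -locations

# Empty (0-byte) WAL files under .logs are a common source of the
# "File is empty" EOFException during log splitting; list them first
# before deciding whether to move any aside.
hdfs dfs -ls -R /hbase/.logs

# Read-only consistency report for HBase tables and regions.
hbase hbck
```

On older Hadoop releases (such as the 2.4.0 docs linked above), `hadoop fsck` is the equivalent of `hdfs fsck`.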
Answer 1 (score: 0)
The problem turned out to be a lack of disk space on that particular HDFS node.
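A quick way to confirm that diagnosis, assuming shell access to the cluster (the data-directory path below is a placeholder for whatever `dfs.datanode.data.dir` points at in your configuration):

```shell
# Per-DataNode capacity and usage as reported by the NameNode;
# look for a node with little or no "DFS Remaining".
hdfs dfsadmin -report

# On the suspect node itself, check the local filesystem holding
# the DataNode's data directory (path is illustrative).
df -h /path/to/dfs/data
```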