Flume: java.io.IOException: Not a data file

Date: 2017-02-22 17:26:54

Tags: hadoop ioexception flume

We had a disk-space-full problem overnight, and today I am getting this error in the Flume logs:

22 Feb 2017 10:24:56,180 ERROR [pool-6-thread-1] (org.apache.flume.client.avro.ReliableSpoolingFileEventReader.openFile:504)  - Exception opening file: /.../flume_spool/data.../data_2017-02-21_17-15-00_8189
java.io.IOException: Not a data file.
        at org.apache.avro.file.DataFileStream.initialize(DataFileStream.java:102)
        at org.apache.avro.file.DataFileReader.<init>(DataFileReader.java:97)
        at org.apache.avro.file.DataFileWriter.appendTo(DataFileWriter.java:160)
        at org.apache.avro.file.DataFileWriter.appendTo(DataFileWriter.java:149)
        at org.apache.flume.serialization.DurablePositionTracker.<init>(DurablePositionTracker.java:141)
        at org.apache.flume.serialization.DurablePositionTracker.getInstance(DurablePositionTracker.java:76)
        at org.apache.flume.client.avro.ReliableSpoolingFileEventReader.openFile(ReliableSpoolingFileEventReader.java:478)
        at org.apache.flume.client.avro.ReliableSpoolingFileEventReader.getNextFile(ReliableSpoolingFileEventReader.java:459)
        at org.apache.flume.client.avro.ReliableSpoolingFileEventReader.readEvents(ReliableSpoolingFileEventReader.java:229)
        at org.apache.flume.source.SpoolDirectorySource$SpoolDirectoryRunnable.run(SpoolDirectorySource.java:227)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

Flume version: 1.5.2

1 answer:

Answer 0 (score: 0):

The java.io.IOException: Not a data file exception comes from the temporary directory in which Flume keeps the metadata it uses to track processing.

This directory is controlled by the trackerDir directive in the spooldir source definition in flume.conf (by default it is .flumespool inside the spool directory).
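
For reference, a minimal spooldir source definition along these lines might look like the sketch below; the agent, source, and channel names (a1, src1, ch1) and the spool path are placeholders, not values from the original setup:

    # hypothetical agent/source/channel names; adapt to your own flume.conf
    a1.sources = src1
    a1.channels = ch1
    a1.sources.src1.type = spooldir
    a1.sources.src1.channels = ch1
    a1.sources.src1.spoolDir = /path/to/flume_spool/data
    # directory for position-tracking metadata; defaults to .flumespool inside spoolDir
    a1.sources.src1.trackerDir = .flumespool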

What we ended up with were empty metadata files, missing the magic bytes that avro (we use an avro sink) expects to see at the start of a data file. There was actually nothing wrong with the data files themselves, only with the metadata files.
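
A quick way to confirm this (assuming standard Avro object container files, which begin with the four ASCII bytes O, b, j followed by 0x01) is to dump the first bytes of the tracker file:

    # a healthy tracker file starts with the Avro magic: O b j \001
    head -c 4 .flumespool/.flumespool-main.meta | od -c
    # an empty or truncated file prints nothing useful, and Avro fails with "Not a data file."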

So the solution was to delete .flumespool, and the problem was solved (after freeing up some space on the disk first, of course). The steps, also summed up as a shell sketch after the list:

  1. cd into your spool folder: /.../flume_spool/data...
  2. Run: find . -type f -empty
  3. I expect you will find this: .flumespool/.flumespool-main.meta
  4. Then rm .flumespool/.flumespool-main.meta
  5. Source
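
Put together, the steps above amount to something like this sketch (the spool path is a placeholder; check what find reports before removing anything):

    # placeholder path; use your actual spoolDir
    cd /path/to/flume_spool/data
    # list empty files, including those in the tracker directory
    find . -type f -empty
    # if only the tracker metadata file is empty, remove it; Flume recreates it on the next run
    rm .flumespool/.flumespool-main.meta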