Flume gives an error

Time: 2015-11-05 20:25:46

Tags: hadoop

Can someone help me solve this problem? I am running a Flume agent. After it starts, it throws an exception as shown below.

    Disk quota exceeded (122)
        at com.mapr.fs.MapRClientImpl.create(MapRClientImpl.java:159)
        at com.mapr.fs.MapRFileSystem.create(MapRFileSystem.java:640)
        at com.mapr.fs.MapRFileSystem.create(MapRFileSystem.java:682)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:809)
        at org.apache.flume.sink.hdfs.HDFSDataStream.doOpen(HDFSDataStream.java:86)
        at org.apache.flume.sink.hdfs.HDFSDataStream.open(HDFSDataStream.java:113)
        at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:275)
        at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:264)
        at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:720)
        at org.apache.flume.sink.hdfs.BucketWriter.runPrivileged(BucketWriter.java:183)
        at org.apache.flume.sink.hdfs.BucketWriter.access$1700(BucketWriter.java:59)
        at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:717)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
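For context, the stack trace points at the Flume HDFS sink (`HDFSDataStream`/`BucketWriter`) failing while opening a new file on MapR-FS because the target volume's disk quota was exceeded. A minimal HDFS sink configuration of the kind that exercises this write path might look like the sketch below; the agent, channel, and sink names and the output path are hypothetical, not taken from the question:

```properties
# Hypothetical Flume agent config (names and path are assumptions).
# The sink writes to the path below; a quota on that volume causes
# "Disk quota exceeded" when BucketWriter opens a new file.
agent1.sinks.hdfsSink.type = hdfs
agent1.sinks.hdfsSink.channel = memChannel
agent1.sinks.hdfsSink.hdfs.path = /flume/events/%Y-%m-%d
# DataStream matches the HDFSDataStream class seen in the stack trace
agent1.sinks.hdfsSink.hdfs.fileType = DataStream
agent1.sinks.hdfsSink.hdfs.rollInterval = 300
agent1.sinks.hdfsSink.hdfs.rollSize = 134217728
agent1.sinks.hdfsSink.hdfs.rollCount = 0
```

Since the error comes from the filesystem rather than Flume itself, checking the quota on the MapR volume backing `hdfs.path` would be the natural first diagnostic step.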

0 Answers:

There are no answers yet