Job submission failed with exception 'org.apache.hadoop.util.DiskChecker$DiskErrorException(No space available in any of the local directories.)'

Time: 2016-04-26 16:24:26

Tags: hadoop hive

When I run a Hive query, I get the following error. Please help me resolve this issue.

hive> insert overwrite table bucket_emp1 select * from emp;

Query ID = hduser_20160426213038_58cbf1dc-a345-40f8-ab3d-a3258046b279
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
org.apache.hadoop.util.DiskChecker$DiskErrorException: No space available in any of the local directories.
    at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:366)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:150)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:131)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:115)
    at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:131)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:163)
    at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:731)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:536)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
    at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:431)
    at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1653)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1412)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Job Submission failed with exception 'org.apache.hadoop.util.DiskChecker$DiskErrorException(No space available in any of the local directories.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

1 Answer:

Answer 0 (score: 0)

While processing a job, the MapReduce framework looks up the directories specified by the mapreduce.cluster.local.dir parameter and verifies that there is enough space on them to create the intermediate files it needs.
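As a sketch, these directories are typically configured in mapred-site.xml; the value shown below is a hypothetical example path, so check what your own cluster actually uses:

```xml
<property>
  <name>mapreduce.cluster.local.dir</name>
  <!-- Comma-separated list of local directories for intermediate data;
       the path here is an illustrative example, not a required default. -->
  <value>/tmp/hadoop-${user.name}/mapred/local</value>
</property>
```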

If those directories do not have free space available, the MapReduce job will fail with the error you shared.

Make sure there is enough free space on the local directories.
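A quick way to check is to run df against the filesystem that holds the local directory. The /tmp path below is an assumption; substitute the actual value of mapreduce.cluster.local.dir from your configuration:

```shell
# Show free space (in human-readable units) on the filesystem that
# contains the MapReduce local directory. /tmp is an assumed path;
# replace it with your mapreduce.cluster.local.dir value.
df -h /tmp
```

If the "Avail" column is near zero, clear out old files or point mapreduce.cluster.local.dir at a larger disk.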

It is also better to compress the intermediate output files (for example with Gzip compression) so that they take up less space during processing.

conf.set("mapred.compress.map.output", "true");
conf.set("mapred.output.compression.type", "BLOCK");
conf.set("mapred.map.output.compression.codec", "org.apache.hadoop.io.compress.GzipCodec");
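Since the question runs the query from the Hive CLI rather than custom Java code, the same effect can be had with session-level settings. This is a sketch: hive.exec.compress.intermediate is Hive's own switch for compressing intermediate output, and the mapreduce.* keys are the newer names for the deprecated mapred.* properties used above:

```
set hive.exec.compress.intermediate=true;
set mapreduce.map.output.compress=true;
set mapreduce.map.output.compress.codec=org.apache.hadoop.io.compress.GzipCodec;
```

These can also be made permanent in hive-site.xml instead of being set per session.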