Permission denied error 13 - Python on Hadoop

Date: 2015-09-23 09:22:13

Tags: python-2.7 hadoop-streaming

I am running a simple Python mapper and reducer and am getting an "error=13, Permission denied". Need help.

I'm not sure what is happening here; I'm new to the Hadoop world.

I am running a simple map-reduce job to count words. The mapper and reducer run fine standalone on Linux or in Windows PowerShell.

======================================================================


hadoop@ubuntu:~/hadoop-1.2.1$ bin/hadoop jar contrib/streaming/hadoop-streaming-1.2.1.jar -file /home/hadoop/mapper.py -mapper mapper.py -file /home/hadoop/reducer.py -reducer reducer.py -input /deepw/pg4300.txt -output /deepw/pg3055
Warning: $HADOOP_HOME is deprecated.

packageJobJar: [/home/hadoop/mapper.py, /home/hadoop/reducer.py, /tmp/hadoop-hadoop/hadoop-unjar2961168567699201508/] [] /tmp/streamjob4125164474101219622.jar tmpDir=null
15/09/23 14:39:16 INFO util.NativeCodeLoader: Loaded the native-hadoop library
15/09/23 14:39:16 WARN snappy.LoadSnappy: Snappy native library not loaded
15/09/23 14:39:16 INFO mapred.FileInputFormat: Total input paths to process : 1
15/09/23 14:39:16 INFO streaming.StreamJob: getLocalDirs(): [/tmp/hadoop-hadoop/mapred/local]
15/09/23 14:39:16 INFO streaming.StreamJob: Running job: job_201509231312_0003
15/09/23 14:39:16 INFO streaming.StreamJob: To kill this job, run:
15/09/23 14:39:16 INFO streaming.StreamJob: /home/hadoop/hadoop-1.2.1/libexec/../bin/hadoop job -Dmapred.job.tracker=192.168.56.102:9001 -kill job_201509231312_0003
15/09/23 14:39:16 INFO streaming.StreamJob: Tracking URL: http://192.168.56.102:50030/jobdetails.jsp?jobid=job_201509231312_0003
15/09/23 14:39:17 INFO streaming.StreamJob: map 0% reduce 0%
15/09/23 14:39:41 INFO streaming.StreamJob: map 100% reduce 100%
15/09/23 14:39:41 INFO streaming.StreamJob: To kill this job, run:
15/09/23 14:39:41 INFO streaming.StreamJob: /home/hadoop/hadoop-1.2.1/libexec/../bin/hadoop job -Dmapred.job.tracker=192.168.56.102:9001 -kill job_201509231312_0003
15/09/23 14:39:41 INFO streaming.StreamJob: Tracking URL: http://192.168.56.102:50030/jobdetails.jsp?jobid=job_201509231312_0003
15/09/23 14:39:41 ERROR streaming.StreamJob: Job not successful. Error: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201509231312_0003_m_000000
15/09/23 14:39:41 INFO streaming.StreamJob: killJob...
Streaming Command Failed!

================================================================
java.io.IOException: Cannot run program "/tmp/hadoop-hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201509231312_0003/attempt_201509231312_0003_m_000001_3/work/./mapper.py": error=13, Permission denied
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:214)
at org.apache.hadoop.streaming.PipeMapper.configure(PipeMapper.java:66)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:426)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:366)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.io.IOException: error=13, Permission denied
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
at java.lang.ProcessImpl.start(ProcessImpl.java:130)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
... 24 more

4 Answers:

Answer 0 (score: 2)

Your mapper file does not seem to be executable. Try chmod a+x mapper.py before submitting the job.
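A minimal local sketch of that fix, run in a scratch directory (the bare file name stands in for the asker's /home/hadoop/mapper.py; the reducer needs the same treatment):

```shell
# Create a stand-in script and grant the execute bit to all users,
# since the task typically runs as a different user than the submitter.
touch mapper.py
chmod a+x mapper.py
# Verify the execute bit before submitting the streaming job.
test -x mapper.py && echo "mapper.py is executable"
```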

Alternatively, in your command you can replace

-mapper mapper.py

with

-mapper "python mapper.py"

Answer 1 (score: 2)

As a note, I also ran into this error 13 recently. In my case, however, the problem was a permissions issue on the directory containing the Python executable and the mappers/reducers: it was not readable by others. After chmod a+rx, my problem was solved.
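A sketch of that directory-level fix (the "scripts" directory and file names are illustrative):

```shell
# Recursively grant read and execute so other users -- including the
# user the task runs as -- can traverse the directory and read the files.
mkdir -p scripts
touch scripts/mapper.py scripts/reducer.py
chmod -R a+rx scripts
# A directory needs the execute bit to be traversable at all.
test -x scripts && test -r scripts/mapper.py && echo "readable by all"
```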

Answer 2 (score: 0)

After doing chmod a+x on the mapper and reducer .py files, I get the exception below. (Adding the python keyword in front of the mapper makes it work and produce correct results.)

========================================================================================

2015-09-28 13:25:16,572 INFO org.apache.hadoop.util.NativeCodeLoader: Loaded the native-hadoop library
2015-09-28 13:25:16,752 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/jars/META-INF <- /deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/attempt_201509281234_0015_m_000000_3/work/META-INF
2015-09-28 13:25:16,761 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/jars/reducer.py <- /deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/attempt_201509281234_0015_m_000000_3/work/reducer.py
2015-09-28 13:25:16,763 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/jars/job.jar <- /deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/attempt_201509281234_0015_m_000000_3/work/job.jar
2015-09-28 13:25:16,766 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/jars/.job.jar.crc <- /deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/attempt_201509281234_0015_m_000000_3/work/.job.jar.crc
2015-09-28 13:25:16,769 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/jars/org <- /deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/attempt_201509281234_0015_m_000000_3/work/org
2015-09-28 13:25:16,771 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/jars/mapper.py <- /deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/attempt_201509281234_0015_m_000000_3/work/mapper.py
2015-09-28 13:25:17,046 WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already exists!
2015-09-28 13:25:17,176 INFO org.apache.hadoop.util.ProcessTree: setsid exited with exit code 0
2015-09-28 13:25:17,184 INFO org.apache.hadoop.mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1e7c7fb
2015-09-28 13:25:17,254 INFO org.apache.hadoop.mapred.MapTask: Processing split: hdfs://192.168.56.101:9000/swad/4300.txt:0+786539
2015-09-28 13:25:17,275 WARN org.apache.hadoop.io.compress.snappy.LoadSnappy: Snappy native library not loaded
2015-09-28 13:25:17,287 INFO org.apache.hadoop.mapred.MapTask: numReduceTasks: 1
2015-09-28 13:25:17,296 INFO org.apache.hadoop.mapred.MapTask: io.sort.mb = 100
2015-09-28 13:25:17,393 INFO org.apache.hadoop.mapred.MapTask: data buffer = 79691776/99614720
2015-09-28 13:25:17,393 INFO org.apache.hadoop.mapred.MapTask: record buffer = 262144/327680
2015-09-28 13:25:17,419 INFO org.apache.hadoop.streaming.PipeMapRed: PipeMapRed exec [/deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/attempt_201509281234_0015_m_000000_3/work/./mapper.py]
2015-09-28 13:25:17,436 ERROR org.apache.hadoop.streaming.PipeMapRed: configuration exception
java.io.IOException: Cannot run program "/deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/attempt_201509281234_0015_m_000000_3/work/./mapper.py": error=2, No such file or directory
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
    at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:214)
    at org.apache.hadoop.streaming.PipeMapper.configure(PipeMapper.java:66)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:426)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:366)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.io.IOException: error=2, No such file or directory
    at java.lang.UNIXProcess.forkAndExec(Native Method)
    at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
    at java.lang.ProcessImpl.start(ProcessImpl.java:130)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
    ... 24 more
2015-09-28 13:25:17,462 INFO org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
2015-09-28 13:25:17,495 INFO org.apache.hadoop.io.nativeio.NativeIO: Initialized cache for UID to User mapping with a cache timeout of 14400 seconds.
2015-09-28 13:25:17,496 INFO org.apache.hadoop.io.nativeio.NativeIO: Got UserName hadoop for UID 1000 from the native implementation
2015-09-28 13:25:17,498 WARN org.apache.hadoop.mapred.Child: Error running child
java.lang.RuntimeException: Error in configuring object
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:426)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:366)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
    ... 9 more
Caused by: java.lang.RuntimeException: Error in configuring object
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
    ... 14 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
    ... 17 more
Caused by: java.lang.RuntimeException: configuration exception
    at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:230)
    at org.apache.hadoop.streaming.PipeMapper.configure(PipeMapper.java:66)
    ... 22 more
Caused by: java.io.IOException: Cannot run program "/deep/mapred/local/taskTracker/hadoop/jobcache/job_201509281234_0015/attempt_201509281234_0015_m_000000_3/work/./mapper.py": error=2, No such file or directory
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
    at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:214)
    ... 23 more
Caused by: java.io.IOException: error=2, No such file or directory
    at java.lang.UNIXProcess.forkAndExec(Native Method)
    at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
    at java.lang.ProcessImpl.start(ProcessImpl.java:130)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
    ... 24 more
2015-09-28 13:25:17,506 INFO org.apache.hadoop.mapred.Task: Runnning cleanup for the task
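The error=2 (No such file or directory) in this trace, unlike the original error=13, frequently points at the script's shebang line rather than at file permissions; since the scripts were also run from Windows PowerShell, CRLF line endings turning the shebang into `#!/usr/bin/env python\r` are a plausible culprit. A hypothetical minimal word-count mapper (the asker's actual mapper.py is not shown):

```python
#!/usr/bin/env python
# Minimal Hadoop Streaming mapper sketch. The shebang above must end with
# a Unix newline: with a CRLF ending, the kernel looks for an interpreter
# literally named "python\r" and reports error=2, No such file or directory.
import sys


def map_words(lines):
    # Turn input lines into "word<TAB>1" records for the reducer.
    records = []
    for line in lines:
        for word in line.strip().split():
            records.append("%s\t1" % word)
    return records


if __name__ == "__main__":
    for record in map_words(sys.stdin):
        print(record)
```

Running dos2unix on the scripts (or re-saving them with Unix line endings) rules this cause out.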

Answer 3 (score: 0)

I also struggled with this issue. I found that everything worked when I ran on a single node (the Cloudera QuickStart VM), but not on a cluster. It seemed the Python scripts were not being shipped to the nodes for execution.

There is a parameter, "-file", that ships a file or directory to the cluster as part of the job. It is mentioned here:

https://wiki.apache.org/hadoop/HadoopStreaming

You can specify this flag more than once, once for the mapper and once for the reducer, like this:

hadoop jar /opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/hadoop-mapreduce/hadoop-streaming.jar -input /user/linux/input -output /user/linux/output_new -mapper wordcount_mapper.py -reducer wordcount_reducer.py -file /home/linux/wordcount_mapper.py -file /home/linux/wordcount_reducer.py

Or you can package the scripts into a directory and ship only the directory, like this:

hadoop jar /opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/hadoop-mapreduce/hadoop-streaming.jar -input /user/linux/input -output /user/linux/output_new -mapper wc/wordcount_mapper.py -reducer wc/wordcount_reducer.py -file /home/linux/wc

Note that I refer to the mapper and reducer scripts with relative paths.

The comments about the files needing to be readable and executable are correct as well.

It took me a while to figure this out. I hope it helps.
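For completeness, a hypothetical reducer sketch to pair with a word-count mapper, assuming the conventional word<TAB>1 records of a streaming word count. Hadoop Streaming sorts map output by key before the reduce phase, so identical words arrive on consecutive lines and a running sum suffices:

```python
#!/usr/bin/env python
# Minimal Hadoop Streaming reducer sketch (illustrative, not the asker's
# actual reducer.py). Relies on the framework's sort phase having grouped
# identical keys onto consecutive input lines.
import sys


def reduce_counts(lines):
    # Sum consecutive "word<TAB>count" records into "word<TAB>total".
    results = []
    current_word, current_count = None, 0
    for line in lines:
        word, _, count = line.strip().partition("\t")
        if word == current_word:
            current_count += int(count)
        else:
            if current_word is not None:
                results.append("%s\t%d" % (current_word, current_count))
            current_word, current_count = word, int(count)
    if current_word is not None:
        results.append("%s\t%d" % (current_word, current_count))
    return results


if __name__ == "__main__":
    for record in reduce_counts(sys.stdin):
        print(record)
```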