Changing the MapReduce intermediate output location with MRJob

Date: 2013-12-15 01:03:19

Tags: python hadoop mapreduce hadoop-streaming mrjob

I am trying to run a Python script with MRJob on a cluster where I do not have admin privileges, and I have pasted the error below. I think what is happening is that the job tries to write its intermediate files to the default /tmp/... directory; since that is a protected directory I have no permission to write to, the job gets an error and exits. I would like to know how to change this tmp output location to somewhere on the local filesystem, for example /home/myusername/some_path_in_my_local_filesystem_on_the_cluster. In short, what extra parameters do I need to pass so that the intermediate output goes somewhere I have write permission instead of /tmp/...?

I invoke the script as:

python myscript.py  input.txt -r hadoop > output.txt
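For reference, the job in the traceback below is mrjob's bundled word-frequency-count example (mr_word_freq_count.py). A minimal sketch of that kind of job, written here from the class name in the traceback rather than copied from the file on the cluster, looks roughly like this:

    from mrjob.job import MRJob
    import re

    WORD_RE = re.compile(r"[\w']+")

    class MRWordFreqCount(MRJob):
        # Mapper: emit (word, 1) for every word on the input line.
        def mapper(self, _, line):
            for word in WORD_RE.findall(line):
                yield word.lower(), 1

        # Combiner and reducer: sum the counts for each word.
        def combiner(self, word, counts):
            yield word, sum(counts)

        def reducer(self, word, counts):
            yield word, sum(counts)

    if __name__ == '__main__':
        MRWordFreqCount.run()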

The error:

    no configs found; falling back on auto-configuration
    creating tmp directory /tmp/13435.1.all.q/mr_word_freq_count.myusername.20131215.004905.274232
    writing wrapper script to /tmp/13435.1.all.q/mr_word_freq_count.myusername.20131215.004905.274232/setup-wrapper.sh
    STDERR: mkdir: org.apache.hadoop.security.AccessControlException: Permission denied: user=myusername, access=WRITE, inode="/":hdfs:supergroup:drwxr-xr-x
    Traceback (most recent call last):
      File "/home/myusername/privatemodules/python/examples/mr_word_freq_count.py", line 37, in <module>
        MRWordFreqCount.run()
      File "/home/myusername/.local/lib/python2.7/site-packages/mrjob/job.py", line 500, in run
        mr_job.execute()
      File "/home/myusername/.local/lib/python2.7/site-packages/mrjob/job.py", line 518, in execute
        super(MRJob, self).execute()
      File "/home/myusername/.local/lib/python2.7/site-packages/mrjob/launch.py", line 146, in execute
        self.run_job()
      File "/home/myusername/.local/lib/python2.7/site-packages/mrjob/launch.py", line 207, in run_job
        runner.run()
      File "/home/myusername/.local/lib/python2.7/site-packages/mrjob/runner.py", line 458, in run
        self._run()
      File "/home/myusername/.local/lib/python2.7/site-packages/mrjob/hadoop.py", line 236, in _run
        self._upload_local_files_to_hdfs()
      File "/home/myusername/.local/lib/python2.7/site-packages/mrjob/hadoop.py", line 263, in _upload_local_files_to_hdfs
        self._mkdir_on_hdfs(self._upload_mgr.prefix)

1 answer:

Answer 0 (score: 0)

Are you running mrjob as a "local" job, or are you trying to run it on a Hadoop cluster?

If you really do want to run it on Hadoop, you can control the "scratch" HDFS location (where mrjob stores its intermediate files) with the --base-tmp-dir flag:

python mr.py -r hadoop -o hdfs:///user/you/output_dir --base-tmp-dir hdfs:///user/you/tmp  hdfs:///user/you/data.txt
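If you do not want to pass the flag on every invocation, mrjob can also pick settings up from a config file such as ~/.mrjob.conf. A minimal sketch, assuming the config key is simply the underscore form of the --base-tmp-dir flag used above:

    # ~/.mrjob.conf -- read automatically by mrjob if present
    runners:
      hadoop:
        # assumed key name, mirroring the --base-tmp-dir value in the command above
        base_tmp_dir: hdfs:///user/you/tmp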