spark-submit python file: '/home/.python-eggs' Permission denied

Asked: 2016-04-20 12:03:55

Tags: python apache-spark permissions denied

I hit a problem when I run a python file with spark-submit. The problem shows up when the 'map' code runs on an 'executor':

Traceback (most recent call last):
File "/usr/lib64/python2.7/runpy.py", line 151, in _run_module_as_main
  mod_name, loader, code, fname = _get_module_details(mod_name)
File "/usr/lib64/python2.7/runpy.py", line 101, in _get_module_details
  loader = get_loader(mod_name)
File "/usr/lib64/python2.7/pkgutil.py", line 464, in get_loader
  return find_loader(fullname)
File "/usr/lib64/python2.7/pkgutil.py", line 474, in find_loader
  for importer in iter_importers(fullname):
File "/usr/lib64/python2.7/pkgutil.py", line 430, in iter_importers
  __import__(pkg)
File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/__init__.py", line 41, in <module>
File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/context.py", line 35, in <module>
File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/rdd.py", line 51, in <module>
File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/shuffle.py", line 33, in <module>
File "build/bdist.linux-x86_64/egg/psutil/__init__.py", line 89, in <module>
File "build/bdist.linux-x86_64/egg/psutil/_pslinux.py", line 24, in <module>
File "build/bdist.linux-x86_64/egg/_psutil_linux.py", line 7, in <module>
File "build/bdist.linux-x86_64/egg/_psutil_linux.py", line 4, in __bootstrap__
File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 945, in resource_filename
  self, resource_name
File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 1633, in get_resource_filename
  self._extract_resource(manager, self._eager_to_zip(name))
File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 1661, in _extract_resource
  self.egg_name, self._parts(zip_path)
File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 1025, in get_cache_path
  self.extraction_error()
File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 991,     inextraction_error
  raise err
  pkg_resources.ExtractionError: Can't extract file(s) to egg cache
  The following error occurred while trying to extract file(s) to the Python egg
  cache:
    [Errno 13] Permission denied: '/home/.python-eggs' 
  The Python egg cache directory is currently set to:  
   /home/.python-eggs  
 Perhaps your account does not have write access to this directory?  You can
 change the cache directory by setting the PYTHON_EGG_CACHE environment
 variable to point to an accessible directory.

I set the PYTHON_EGG_CACHE environment variable for every executor, and I also added os.environ['PYTHON_EGG_CACHE'] = "/tmp/" to the program itself, but the problem persists.
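
For reference, PySpark also exposes a programmatic way to set executor environment variables before the context is created; a minimal sketch of that approach (untested here, reusing the same /tmp/ cache directory) would be:

from pyspark import SparkConf, SparkContext

# Sketch: set the egg cache for the executors *before* the SparkContext
# exists. Calling os.environ inside a map function runs too late, because
# pyspark (and its bundled psutil egg) are imported when the worker
# process starts, which is when the failing extraction happens.
conf = SparkConf().setExecutorEnv("PYTHON_EGG_CACHE", "/tmp/")
sc = SparkContext(conf=conf)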

My code:

import os,sys
print "env::::"+os.environ['PYTHON_EGG_CACHE']
from pyspark import SparkConf, SparkContext
# Load and parse the data
def parsePoint(line):
    import os
    print "env::::"+os.environ['PYTHON_EGG_CACHE']
    os.environ['PYTHON_EGG_CACHE'] = "/tmp/"
    values = [float(x) for x in line.split(' ')]
    return line

if __name__ == "__main__":
    os.environ['PYTHON_EGG_CACHE'] = "/tmp/"
    print "env::::"+os.environ['PYTHON_EGG_CACHE']
    conf = SparkConf()
    sc = SparkContext(conf = conf)
    data = sc.textFile(sys.argv[1])
    parsedData = data.map(parsePoint)
    parsedData.collect()

When I run this python program in 'standalone' mode it succeeds. This is my submit command:

spark-submit --name test_py --master yarn-client testpy.py input/sample_svm_data.txt

Is the problem with YARN?

3 Answers:

Answer 0 (score: 3)

This is quite late, but it's the first Google result I found for this problem... The previous answer was helpful (it told me which env vars I had to modify), but please don't edit the Spark source code; just change the environment variables with the proper tools, adding this to your spark conf variables:

spark.executorEnv.PYTHON_EGG_CACHE="./.python-eggs/"
spark.executorEnv.PYTHON_EGG_DIR="./.python-eggs/"
spark.driverEnv.PYTHON_EGG_CACHE="./.python-eggs/"
spark.driverEnv.PYTHON_EGG_DIR="./.python-eggs/"

(I didn't want to use /tmp/: since '.' gets deleted after my job ends, the eggs go away with it too, IMO.)

Answer 1 (score: 0)

I solved this problem as follows: unzip pyspark.zip and find the rdd.py file. Open it and, right after the "import os" line, add this code:

# Redirect the egg cache to a writable location before psutil is imported.
os.environ['PYTHON_EGG_CACHE'] = '/tmp/.python-eggs/'
os.environ['PYTHON_EGG_DIR'] = '/tmp/.python-eggs/'

Save the file and zip pyspark back up.

Answer 2 (score: 0)

I solved this problem with the help of BiS's answer. Adding four configuration values when running spark-submit fixed the egg problem for me.

Below is an example of adding the four parameters when using spark-submit (with the question's testpy.py standing in for your own application):

spark-submit \
    --conf spark.executorEnv.PYTHON_EGG_CACHE="./.python-eggs/" \
    --conf spark.executorEnv.PYTHON_EGG_DIR="./.python-eggs/" \
    --conf spark.driverEnv.PYTHON_EGG_CACHE="./.python-eggs/" \
    --conf spark.driverEnv.PYTHON_EGG_DIR="./.python-eggs/" \
    testpy.py input/sample_svm_data.txt
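
Note that the relative path "./.python-eggs/" resolves inside each YARN container's working directory, which YARN removes once the application finishes; that is presumably the behavior BiS relied on above when avoiding a shared /tmp/ location.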