I have a docker image containing various bits and pieces, including Spark. Here is my Dockerfile:
FROM docker-dev.artifactory.company.com/centos:7.3.1611
# set proxy
ENV http_proxy http://proxyaddr.co.uk:8080
ENV HTTPS_PROXY http://proxyaddr.co.uk:8080
ENV https_proxy http://proxyaddr.co.uk:8080
RUN yum install -y epel-release
RUN yum install -y gcc
RUN yum install -y krb5-devel
RUN yum install -y python-devel
RUN yum install -y krb5-workstation
RUN yum install -y python-setuptools
RUN yum install -y python-pip
RUN yum install -y xmlstarlet
RUN yum install -y wget java-1.8.0-openjdk
RUN pip install kerberos
RUN pip install numpy
RUN pip install pandas
RUN pip install coverage
RUN pip install tensorflow
RUN wget http://d3kbcqa49mib13.cloudfront.net/spark-1.6.0-bin-hadoop2.6.tgz
RUN tar -xvzf spark-1.6.0-bin-hadoop2.6.tgz -C /opt
RUN ln -s spark-1.6.0-bin-hadoop2.6 /opt/spark
ENV VERSION_NUMBER $(cat VERSION)
ENV JAVA_HOME /etc/alternatives/jre/
ENV SPARK_HOME /opt/spark
ENV PYTHONPATH $SPARK_HOME/python/:$PYTHONPATH
ENV PYTHONPATH $SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH
I can build and then run that docker image, connect to it, and successfully import the pyspark library:
$ docker run -d -it sse_spark_build:1.0
09e8aac622d7500e147a6e6db69f806fe093b0399b98605c5da2ff5e0feca07c
$ docker exec -it 09e8aac622d7 python
Python 2.7.5 (default, Nov 6 2016, 00:28:07)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-11)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from pyspark import SparkContext
>>> import os
>>> os.environ['PYTHONPATH']
'/opt/spark/python/lib/py4j-0.9-src.zip:/opt/spark/python/:'
>>>
Note the PYTHONPATH — it contains both Spark entries set in the Dockerfile!
The problem is that the behaviour is different in PyCharm if I use that same docker image as the interpreter. I have the image configured as a remote interpreter in PyCharm's settings; if I then run a Python console in PyCharm, this happens:
bec0b9189066:python /opt/.pycharm_helpers/pydev/pydevconsole.py 0 0
PyDev console: starting.
import sys; print('Python %s on %s' % (sys.version, sys.platform))
sys.path.extend(['/home/cengadmin/git/dhgitlab/sse/engine/fs/programs/pyspark', '/home/cengadmin/git/dhgitlab/sse/engine/fs/programs/pyspark'])
Python 2.7.5 (default, Nov 6 2016, 00:28:07)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-11)] on linux2
import os
os.environ['PYTHONPATH']
'/opt/.pycharm_helpers/pydev'
As you can see, PyCharm has changed PYTHONPATH, which means I can no longer use the pyspark library I want to use:
from pyspark import SparkContext
Traceback (most recent call last):
File "<input>", line 1, in <module>
ImportError: No module named pyspark
OK, I can fix this from within the console by appending to sys.path:
import sys
sys.path.append('/opt/spark/python/')
sys.path.append('/opt/spark/python/lib/py4j-0.9-src.zip')
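Here's a slightly more robust version of the same workaround (a sketch: it assumes SPARK_HOME is set, as it is in my Dockerfile, and globs for the py4j zip so a Spark upgrade doesn't break it):

import glob
import os
import sys

# Fall back to /opt/spark if SPARK_HOME isn't visible to the console.
spark_home = os.environ.get('SPARK_HOME', '/opt/spark')
sys.path.append(os.path.join(spark_home, 'python'))
# The py4j version changes between Spark releases, so find it by pattern.
for py4j_zip in glob.glob(os.path.join(spark_home, 'python', 'lib', 'py4j-*-src.zip')):
    sys.path.append(py4j_zip)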
But having to do that every time I open a console is tedious. I can't believe there's no way to tell PyCharm to append to PYTHONPATH rather than overwrite it, but if there is I can't find it. Can anyone offer any advice? How can I use a docker image as a remote interpreter in PyCharm and preserve the value of PYTHONPATH?
Answer 0 (score: 3)
You can either set environment variables or update the "Starting script" section in the Python Console settings. Use whichever suits you better; both will do the job.
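For example, pasting something like this into the "Starting script" field should work (a minimal sketch, assuming Spark lives at /opt/spark inside the image, as in the Dockerfile above):

import os
import sys

# Re-add the Spark entries that the PyCharm console drops from PYTHONPATH.
spark_python = '/opt/spark/python'
py4j_zip = os.path.join(spark_python, 'lib', 'py4j-0.9-src.zip')
for path in (spark_python, py4j_zip):
    if path not in sys.path:
        sys.path.append(path)

Alternatively, supply the same paths via the console's "Environment variables" field, e.g. PYTHONPATH=/opt/spark/python:/opt/spark/python/lib/py4j-0.9-src.zip.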
If you need further help, this article is worth reading: https://www.jetbrains.com/help/pycharm/python-console.html
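Either way, once the console restarts, the import from your question should succeed (a quick check; the app name here is arbitrary):

from pyspark import SparkContext

sc = SparkContext('local[*]', 'console-test')
print(sc.version)  # should report 1.6.0 for the Spark build in this image
sc.stop()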