I installed pyspark on my anaconda testenv with conda install -c conda-forge pyspark, and it is here (I think): /Users/myuser/anaconda3/envs/testenv1/lib/python3.6/site-packages/pyspark/python/pyspark. This path exists. Next I start spyder:

(testenv1) ➜ ~ spyder

Here is my code. I thought site-packages would be picked up automatically, or is that a different issue?

import os
os.environ['SPARK_HOME'] = "/Users/myuser/anaconda3/envs/testenv1/lib/python3.6/site-packages/pyspark" # Not working, but also not sure why I need this line at all; pyspark appears to be in site-packages
from pyspark import SparkConf, SparkContext

conf = SparkConf().setMaster("local").setAppName("WordCount")
sc = SparkContext(conf = conf)

This produces the following error:
runfile('/Users/myuser/dev/projects/python-snippets/pyspark.py', wdir='/Users/myuser/dev/projects/python-snippets')
Traceback (most recent call last):
  File "<ipython-input-1-969f4e596614>", line 1, in <module>
    runfile('/Users/myuser/dev/projects/python-snippets/pyspark.py', wdir='/Users/myuser/dev/projects/python-snippets')
  File "/Users/myuser/anaconda3/envs/testenv1/lib/python3.6/site-packages/spyder/utils/site/sitecustomize.py", line 705, in runfile
    execfile(filename, namespace)
  File "/Users/myuser/anaconda3/envs/testenv1/lib/python3.6/site-packages/spyder/utils/site/sitecustomize.py", line 102, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)
  File "/Users/myuser/dev/projects/python-snippets/pyspark.py", line 13, in <module>
    from pyspark import SparkConf, SparkContext
  File "/Users/myuser/dev/projects/python-snippets/pyspark.py", line 13, in <module>
    from pyspark import SparkConf, SparkContext
ImportError: cannot import name 'SparkConf'
Note that I have already tried updating the python interpreter in spyder to /Users/myuser/anaconda3/envs/testenv1/bin/python3.6, but I get the same error.
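For reference, here is a minimal diagnostic sketch (paths assumed from my setup above) to check which module import pyspark actually resolves:

import sys
import pyspark

# If this prints .../python-snippets/pyspark.py instead of
# .../site-packages/pyspark/__init__.py, a local file is shadowing the package.
print(pyspark.__file__)

# The running script's own directory sits at the front of sys.path,
# so a pyspark.py next to the script always wins over site-packages.
print(sys.path[0])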
Answer 0 (score: 2)
Is python-snippets/pyspark.py your file? If so, you should not use the name pyspark.py, because it conflicts with the real pyspark package: Python imports your script instead of the installed package, which is why the traceback shows pyspark.py importing itself. Rename the file to something else and it should work.
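To illustrate the fix, a minimal sketch assuming a hypothetical renamed file word_count.py; with a conda-installed pyspark, the SPARK_HOME line should not be needed at all:

# word_count.py -- hypothetical new name; anything except pyspark.py works
from pyspark import SparkConf, SparkContext

conf = SparkConf().setMaster("local").setAppName("WordCount")
sc = SparkContext(conf=conf)
print(sc.version)  # smoke test: prints the Spark version if the context started
sc.stop()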