I am trying to run pyspark on my local machine, but starting the Spark context somehow raises an error from one of the Spark SQL libraries.
(dataplot) name@name:~$ pyspark
Python 3.6.7 (default, Oct 22 2018, 11:32:17)
[GCC 8.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
Traceback (most recent call last):
  File "/home/inkadmin/virtualenvs/dataplot/lib/python3.6/site-packages/pyspark/python/pyspark/shell.py", line 31, in <module>
    from pyspark import SparkConf
  File "/home/inkadmin/virtualenvs/dataplot/lib/python3.6/site-packages/pyspark/__init__.py", line 51, in <module>
    from pyspark.context import SparkContext
  File "/home/inkadmin/virtualenvs/dataplot/lib/python3.6/site-packages/pyspark/context.py", line 43, in <module>
    from pyspark.profiler import ProfilerCollector, BasicProfiler
  File "/home/inkadmin/virtualenvs/dataplot/lib/python3.6/site-packages/pyspark/profiler.py", line 18, in <module>
    import cProfile
  File "/usr/lib/python3.6/cProfile.py", line 10, in <module>
    import profile as _pyprofile
  File "/home/inkadmin/profile.py", line 12, in <module>
    from pyspark.sql import SparkSession
  File "/home/inkadmin/virtualenvs/dataplot/lib/python3.6/site-packages/pyspark/sql/__init__.py", line 45, in <module>
    from pyspark.sql.types import Row
  File "/home/inkadmin/virtualenvs/dataplot/lib/python3.6/site-packages/pyspark/sql/types.py", line 36, in <module>
    from pyspark import SparkContext
ImportError: cannot import name 'SparkContext'
>>>
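One detail worth noting in the traceback: when `cProfile` does `import profile`, Python resolves it to `/home/inkadmin/profile.py` rather than the standard library's `/usr/lib/python3.6/profile.py`. A minimal sketch of how one can check which file a module name actually resolves to (the module name `profile` here comes from the traceback; the printed path depends on the machine):

```python
import importlib.util

# Ask the import system which file would satisfy "import profile".
# If a local profile.py shadows the standard library module, its path
# appears here instead of the stdlib location.
spec = importlib.util.find_spec("profile")
print(spec.origin)
```

If the printed path points into the home directory instead of the Python installation, the local file is the one being imported during startup.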