SparkContext returns an error in pyspark in both cmd and Jupyter
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'sc' is not defined
I tried:
>>> from pyspark import SparkContext
>>> sc = SparkContext()
But it still shows an error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "c:\spark\python\pyspark\context.py", line 115, in __init__
SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
File "c:\spark\python\pyspark\context.py", line 275, in _ensure_initialized
callsite.function, callsite.file, callsite.linenum))
ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, master=local[*]) created by getOrCreate at c:\spark\bin\..\python\pyspark\shell.py:43
How can I solve this problem?
Answer 0 (score: 1)
You probably have another notebook (or the pyspark shell itself) already running a SparkContext; instead of constructing a new one, you can use SparkContext.getOrCreate().
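A minimal sketch of reusing the existing context rather than creating a second one (assumes PySpark is on the Python path, as in the shell session above; the app name and master shown are just illustrative):

    from pyspark import SparkConf, SparkContext

    # getOrCreate() returns the SparkContext that the pyspark shell
    # (or another notebook) already started, instead of raising
    # "Cannot run multiple SparkContexts at once".
    conf = SparkConf().setAppName("PySparkShell").setMaster("local[*]")
    sc = SparkContext.getOrCreate(conf)

    print(sc.appName, sc.master)

Note that if a context already exists, getOrCreate() simply returns it and the supplied conf is ignored, so this is safe to call from multiple notebooks.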
Regards.