ValueError: Cannot run multiple SparkContexts at once in Spark with pyspark

Date: 2017-09-21 19:37:22

Tags: python-3.x apache-spark pyspark

I'm new to Spark, and I'm trying to run this code in pyspark:

from pyspark import SparkConf, SparkContext
import collections

conf = SparkConf().setMaster("local").setAppName("RatingsHistogram")
sc = SparkContext(conf = conf)

but it gives me this error message:

Using Python version 3.5.2 (default, Jul  5 2016 11:41:13)
SparkSession available as 'spark'.
>>> from pyspark import SparkConf, SparkContext
>>> import collections
>>> conf = SparkConf().setMaster("local").setAppName("RatingsHistogram")
>>> sc = SparkContext(conf = conf)

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\spark\python\pyspark\context.py", line 115, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "C:\spark\python\pyspark\context.py", line 275, in _ensure_initialized
    callsite.function, callsite.file, callsite.linenum))
ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, master=local[*]) created by getOrCreate at C:\spark\bin\..\python\pyspark\shell.py:43
>>>

I have Spark 2.1.1 and Python 3.5.2. I searched and found that the problem is with sc, which cannot be read, but I couldn't find out why. Can anyone here help?

3 Answers:

Answer 0 (score: 5):

You can try:

sc = SparkContext.getOrCreate(conf=conf)
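
This works because the pyspark shell has already created a SparkContext for you (the existing SparkContext(app=PySparkShell, master=local[*]) mentioned in the error), and SparkContext.getOrCreate returns that existing context instead of trying to construct a second one. A minimal sketch of the full snippet, assuming it is pasted into the same pyspark shell session:

from pyspark import SparkConf, SparkContext

# Desired configuration; inside the pyspark shell an existing context is
# reused, so these settings only take effect if a brand-new SparkContext
# actually has to be created.
conf = SparkConf().setMaster("local").setAppName("RatingsHistogram")

# Returns the shell's existing SparkContext instead of raising
# "Cannot run multiple SparkContexts at once".
sc = SparkContext.getOrCreate(conf=conf)

print(sc.appName, sc.master)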

Answer 1 (score: 0):

You can try this:

sc = SparkContext.getOrCreate()
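
Note that without a conf argument, getOrCreate simply returns the shell's existing SparkContext with its default settings (app=PySparkShell, master=local[*]); pass conf=conf as in the answer above if a newly created context should pick up your own configuration.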

Answer 2 (score: 0):

Your previous session is still active. You can run:

sc.stop()
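
After stopping the context that the shell created automatically, a new one with your own configuration can be started. A minimal sketch, assuming it is run in the same pyspark shell session:

from pyspark import SparkConf, SparkContext

# Stop the SparkContext the pyspark shell created at startup.
sc.stop()

# A fresh context with the custom configuration can now be created
# without hitting "Cannot run multiple SparkContexts at once".
conf = SparkConf().setMaster("local").setAppName("RatingsHistogram")
sc = SparkContext(conf=conf)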