'JavaPackage' object is not callable: pyspark 2.3.0, Anaconda, Win10

Posted: 2018-03-07 19:10:27

Tags: pyspark anaconda py4j

I am just getting started with pySpark. I installed Anaconda on Win10, copied an example, and when I run the code I get this error:

Traceback (most recent call last):
  File ".\testingSpark.py", line 7, in <module>
    spark = SparkSession.builder.master("local").getOrCreate()
  File "D:\Windows\Anaconda3\lib\site-packages\pyspark\sql\session.py", line 173, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "D:\Windows\Anaconda3\lib\site-packages\pyspark\context.py", line 331, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "D:\Windows\Anaconda3\lib\site-packages\pyspark\context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "D:\Windows\Anaconda3\lib\site-packages\pyspark\context.py", line 188, in _do_init
    self._javaAccumulator = self._jvm.PythonAccumulatorV2(host, port)
TypeError: 'JavaPackage' object is not callable
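For context, this error comes from py4j: when a requested Java class cannot be resolved on the JVM classpath, py4j hands back a generic `JavaPackage` placeholder instead of a callable class, so calling it raises a `TypeError`. Here is a rough pure-Python analogue of that behavior (hypothetical names; this is a sketch, not py4j's actual implementation):

```python
# Rough pure-Python analogue of how py4j surfaces a missing Java class.
# (Hypothetical names; not py4j's actual implementation.)

class JavaPackage:
    """Placeholder returned when a name can't be resolved to a Java class."""
    def __getattr__(self, name):
        # Unresolved sub-names just yield more package placeholders.
        return JavaPackage()

jvm = JavaPackage()

# If PythonAccumulatorV2 isn't found on the JVM classpath, the attribute
# lookup returns a JavaPackage instead of a callable class, so calling
# it fails with the same TypeError as in the traceback above:
try:
    jvm.PythonAccumulatorV2("localhost", 12345)
except TypeError as e:
    print(e)  # 'JavaPackage' object is not callable
```

In the real traceback this means the Python side of pyspark is running, but the matching Spark class is not visible to the JVM, which is why the failure happens only once `SparkContext` tries to instantiate a Java object.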

I have read up on this error but cannot find anything that solves it. Please help me!

0 answers:

No answers