Py4JError: JVM

Date: 2015-12-10 02:51:32

Tags: apache-spark pyspark

I am running pyspark, but it is occasionally unstable. It has crashed several times on this command:

spark_conf = SparkConf()

with the following error message:

  File "/home/user1/spark/spark-1.5.1-bin-hadoop2.6/python/pyspark/conf.py", line 106, in __init__
    self._jconf = _jvm.SparkConf(loadDefaults)
  File "/home/user1/spark/spark-1.5.1-bin-hadoop2.6/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 772, in __getattr__
    raise Py4JError('{0} does not exist in the JVM'.format(name))
Py4JError: SparkConf does not exist in the JVM

Any idea what the problem is? Thanks for your help!

1 Answer:

Answer 0 (score: 1)

SparkConf is not predefined in the pyspark context; try:

from pyspark import SparkConf

in the pyspark console or in your code.