Getting error py4j.protocol.Py4JNetworkError: An error occurred while trying to connect to the Java server when creating a Spark context

Date: 2016-03-08 08:55:24

Tags: apache-spark cassandra pyspark

I am trying to create a Spark context object in pyspark with the following commands:

from pyspark import SparkContext, SparkConf
conf = (SparkConf()
        .setAppName('App_name')
        .setMaster("spark://local-or-remote-ip:7077")
        .set('spark.cassandra.connection.host', 'cassandra-machine-ip')
        .set('spark.storage.memoryFraction', '0.2')
        .set('spark.rdd.compress', 'true')
        .set('spark.streaming.blockInterval', 500)
        .set('spark.serializer', 'org.apache.spark.serializer.KryoSerializer')
        .set('spark.scheduler.mode', 'FAIR')
        .set('spark.mesos.coarse', 'true'))
sc = SparkContext(conf=conf)

But I get the following error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/spark-1.4.1/python/pyspark/conf.py", line 106, in __init__
    self._jconf = _jvm.SparkConf(loadDefaults)
  File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 766, in __getattr__
  File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 362, in send_command
  File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 318, in _get_connection
  File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 325, in _create_connection
  File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 432, in start
py4j.protocol.Py4JNetworkError: An error occurred while trying to connect to the Java server
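
From the stack trace, the failure happens inside SparkConf.__init__, before anything touches the cluster, so it is the Py4J gateway to the driver-side JVM that is unreachable. As a sanity check, here is a minimal sketch (my assumptions: the same Spark 1.4.1 install, and 'gateway_check' is just a placeholder app name) that exercises the gateway without the cluster:

from pyspark import SparkContext, SparkConf

# A local master never contacts spark://...:7077, so if this also raises
# Py4JNetworkError, the driver-side JVM gateway itself is failing to launch
# (worth checking JAVA_HOME / SPARK_HOME before anything cluster-related).
conf = SparkConf().setAppName('gateway_check').setMaster('local[1]')
sc = SparkContext(conf=conf)
print(sc.parallelize(range(10)).sum())  # expect 45 when the gateway is healthy
sc.stop()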

I have built the Spark jars with sbt, and when I run the same job through Scala I get this error:

ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[sparkDriver-akka.actor.default-dispatcher-18,5,main] org.apache.spark.SparkException: Exiting due to error from cluster scheduler: All masters are unresponsive! Giving up

This happens even though the master is up and running.
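
Since the Scala run reports that all masters are unresponsive even though the master process is up, a plain reachability probe helps separate networking problems from protocol-level problems. A minimal sketch (host and port copied from the master URL above; adjust to your setup):

import socket

# Probe the standalone master's RPC port ('local-or-remote-ip' and 7077 are
# taken from the master URL used above).
def port_open(host, port, timeout=5):
    try:
        s = socket.create_connection((host, port), timeout=timeout)
        s.close()
        return True
    except socket.error:
        return False

print(port_open('local-or-remote-ip', 7077))

# If this prints True, the port is reachable and "All masters are unresponsive"
# more likely means the driver and the master disagree at the protocol level
# (e.g. mismatched Spark versions between the sbt build and the cluster).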

0 Answers:

No answers yet.