Unable to create SparkContext in Spark version 2.0.1

Date: 2017-02-20 07:37:21

Tags: python apache-spark pyspark pyspark-sql

I recently upgraded my Spark version from 1.5 to 2.0.1, and my Python script stopped working.

Code that worked in 1.5:

sc = SparkContext(appName="YOGI")

Modified code for Spark 2.0.1:

sc =SparkContext().master("spark://107.110.74.58:7077").appName("Python Spark SQL basic example").getOrCreate()


File "/home/yogendra.s/codebase/processRawData.py", line 56, in <module>
    sc =SparkContext().master("spark://107.110.74.58:7077").appName("Python Spark SQL basic example").getOrCreate()
  File "/home/yogendra.s/.spark_update/spark_hadoop2_7/python/lib/pyspark.zip/pyspark/context.py", line 115, in __init__
  File "/home/yogendra.s/.spark_update/spark_hadoop2_7/python/lib/pyspark.zip/pyspark/context.py", line 174, in _do_init
  File "/home/yogendra.s/.spark_update/spark_hadoop2_7/python/lib/pyspark.zip/pyspark/accumulators.py", line 259, in _start_update_server
  File "/usr/lib/python2.7/SocketServer.py", line 420, in __init__
    self.server_bind()
  File "/usr/lib/python2.7/SocketServer.py", line 434, in server_bind
    self.socket.bind(self.server_address)
  File "/usr/lib/python2.7/socket.py", line 224, in meth
    return getattr(self._sock,name)(*args)
socket.gaierror: [Errno -2] Name or service not known



Contents of my default.xml:
spark.master                       spark://107.110.74.58:7077
spark.driver.memory                20g
spark.executor.memory              20g

1 Answer:

Answer 0 (score: 1)

Reviewing your code:

sc = SparkContext().master("spark://107.110.74.58:7077").appName("Python Spark SQL basic example").getOrCreate()

You should try using .setMaster instead of .master.

The Spark documentation suggests:

conf = SparkConf().setAppName(appName).setMaster(master)
sc = SparkContext(conf=conf)

In your case, try:

from pyspark import SparkContext, SparkConf

conf = SparkConf().setAppName("Python Spark SQL basic example").setMaster("spark://107.110.74.58:7077")
sc = SparkContext(conf=conf)

Note that I have removed the .getOrCreate() part.
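
As an aside, the .master(...).appName(...).getOrCreate() chain attempted in the question is actually the SparkSession builder API introduced in Spark 2.0, not a SparkContext method. A minimal sketch of that style, reusing the master URL and app name from the question:

from pyspark.sql import SparkSession

# Build (or reuse) a SparkSession; in Spark 2.0+ this is the entry
# point that exposes the .master/.appName builder methods.
spark = SparkSession.builder \
    .master("spark://107.110.74.58:7077") \
    .appName("Python Spark SQL basic example") \
    .getOrCreate()

# The underlying SparkContext is still available if needed:
sc = spark.sparkContext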