Importing pyspark for a standalone application

Asked: 2015-02-09 11:29:11

Tags: python apache-spark pyspark

I am learning to use Spark and have been following this article so far. When I try to import pyspark, I get the error below, even though the file accumulators.py does exist in the pyspark package.

>>> import os
>>> import sys
>>> os.environ['SPARK_HOME'] = "E:\\spark-1.2.0"
>>> sys.path.append("E:\\spark-1.2.0\\python")
>>> from pyspark import SparkContext
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "E:\spark-1.2.0\python\pyspark\__init__.py", line 41, in <module>
    from pyspark.context import SparkContext
  File "E:\spark-1.2.0\python\pyspark\context.py", line 30, in <module>
    from pyspark.java_gateway import launch_gateway
  File "E:\spark-1.2.0\python\pyspark\java_gateway.py", line 26, in <module>
    from py4j.java_gateway import java_import, JavaGateway, GatewayClient
ImportError: No module named py4j.java_gateway
>>> sys.path.append("E:\\spark-1.2.0\\python\\build")
>>> from pyspark import SparkContext
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "E:\spark-1.2.0\python\pyspark\__init__.py", line 41, in <module>
    from pyspark.context import SparkContext
  File "E:\spark-1.2.0\python\pyspark\context.py", line 25, in <module>
    from pyspark import accumulators
ImportError: cannot import name accumulators

How can I resolve this error? I am on Windows 7 with Java 8, and my Python version is Python 2.7.6 :: Anaconda 1.9.2 (64-bit).

2 answers:

Answer 0 (score: 2)

I ran into the same problem following the same article, and was able to fix it by changing the 00-pyspark-setup.py script to add the SPARK_HOME/python/lib path to Python's sys.path, rather than SPARK_HOME/python directly.

My complete 00-pyspark-setup.py script now looks like this:

import os
import sys

# Configure the environment
#if 'SPARK_HOME' not in os.environ:
#    os.environ['SPARK_HOME'] = '/srv/spark'

# Create a variable for our root path
SPARK_HOME = os.environ['SPARK_HOME']

# Add the PySpark/py4j to the Python Path
sys.path.insert(0, os.path.join(SPARK_HOME, "python", "lib"))
sys.path.insert(0, os.path.join(SPARK_HOME, "python"))

Answer 1 (score: 0)

Try adding E:\spark-1.2.0\python\lib\py4j-0.8.2.1-src.zip to your PYTHONPATH.
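Equivalently, if you would rather do this from inside the interpreter than through the PYTHONPATH environment variable, you can append the zip to sys.path, since Python can import modules directly out of a zip archive. A minimal sketch using the paths from the question (adjust to your own Spark install):

import os
import sys

os.environ['SPARK_HOME'] = "E:\\spark-1.2.0"
sys.path.append("E:\\spark-1.2.0\\python")
# py4j ships with Spark as a source zip; putting it on sys.path makes it importable
sys.path.append("E:\\spark-1.2.0\\python\\lib\\py4j-0.8.2.1-src.zip")

from pyspark import SparkContext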