PySpark exception when using IPython

Date: 2016-04-14 07:00:48

Tags: apache-spark ipython pyspark

I have installed PySpark and the IPython notebook on Ubuntu 12.04.

After the installation, running "ipython --profile=pyspark" throws the following exception:

ubuntu_user@ubuntu_user-VirtualBox:~$ ipython --profile=pyspark  
Python 2.7.3 (default, Jun 22 2015, 19:33:41) 
Type "copyright", "credits" or "license" for more information.

IPython 0.12.1 -- An enhanced Interactive Python.
?         -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help      -> Python's own help system.
object?   -> Details about 'object', use 'object??' for extra details.

IPython profile: pyspark
Error: Must specify a primary resource (JAR or Python or R file)
Run with --help for usage help or --verbose for debug output
---------------------------------------------------------------------------
Exception                                 Traceback (most recent call last)
/usr/lib/python2.7/dist-packages/IPython/utils/py3compat.pyc in execfile(fname, *where)
    173             else:
    174                 filename = fname
--> 175             __builtin__.execfile(filename, *where)

/home/ubuntu_user/.config/ipython/profile_pyspark/startup/00-pyspark-setup.py in <module>()
      6 sys.path.insert(0, os.path.join(spark_home, 'python/lib/py4j-0.8.2.1-src.zip'))
      7 
----> 8 execfile(os.path.join(spark_home, 'python/pyspark/shell.py'))
      9 

/home/ubuntu_user/spark/python/pyspark/shell.py in <module>()
     41     SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
     42 
---> 43 sc = SparkContext(pyFiles=add_files)
     44 atexit.register(lambda: sc.stop())
     45 

/home/ubuntu_user/spark/python/pyspark/context.pyc in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
    108         """
    109         self._callsite = first_spark_call() or CallSite(None, None, None)
--> 110         SparkContext._ensure_initialized(self, gateway=gateway)
    111         try:
    112             self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,

/home/ubuntu_user/spark/python/pyspark/context.pyc in _ensure_initialized(cls, instance, gateway)
    232         with SparkContext._lock:
    233             if not SparkContext._gateway:
--> 234                 SparkContext._gateway = gateway or launch_gateway()
    235                 SparkContext._jvm = SparkContext._gateway.jvm
    236 

/home/ubuntu_user/spark/python/pyspark/java_gateway.pyc in launch_gateway()
     92                 callback_socket.close()
     93         if gateway_port is None:
---> 94             raise Exception("Java gateway process exited before sending the driver its port number")
     95 
     96         # In Windows, ensure the Java child processes do not linger after Python has exited.


Exception: Java gateway process exited before sending the driver its port number

Below are my setup and configuration files.

ubuntu_user@ubuntu_user-VirtualBox:~$ ls /home/ubuntu_user/spark
bin          ec2       licenses  README.md
CHANGES.txt  examples  NOTICE    RELEASE
conf         lib       python    sbin
data         LICENSE   R         spark-1.5.2-bin-hadoop2.6.tgz

Below is the IPython setup:

ubuntu_user@ubuntu_user-VirtualBox:~$ ls .config/ipython/profile_pyspark/
db              ipython_config.py           log  security
history.sqlite  ipython_notebook_config.py  pid  startup

IPython and Spark (PySpark) configuration:

ubuntu_user@ubuntu_user-VirtualBox:~$ vi .config/ipython/profile_pyspark/ipython_notebook_config.py

# Configuration file for ipython-notebook.

c = get_config()

# IPython PySpark
c.NotebookApp.ip = 'localhost'
c.NotebookApp.open_browser = False
c.NotebookApp.port = 7770


ubuntu_user@ubuntu_user-VirtualBox:~$ vi .config/ipython/profile_pyspark/startup/00-pyspark-setup.py
import os
import sys

spark_home = os.environ.get('SPARK_HOME', None)
sys.path.insert(0, spark_home + "/python")
sys.path.insert(0, os.path.join(spark_home, 'python/lib/py4j-0.8.2.1-src.zip'))

execfile(os.path.join(spark_home, 'python/pyspark/shell.py'))

The following environment variables are set in .bashrc or .bash_profile:

ubuntu_user@ubuntu_user-VirtualBox:~$ vi .bashrc 
export SPARK_HOME="/home/ubuntu_user/spark"
export PYSPARK_SUBMIT_ARGS="--master local[2]"

I am new to Apache Spark and IPython. How can I fix this?

3 answers:

Answer 0 (score: 0)

I ran into the same exception when my virtual machine did not have enough memory for Java. Once I allocated more memory to the VM, the exception went away.

Steps: shut down the VM -> VirtualBox Settings -> "System" tab -> increase the memory.

(However, this is probably just a workaround. I suspect the correct way to resolve this exception is to configure Spark's memory settings to match the Java memory that is actually available.)
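
If you go that route, one option is to cap the driver JVM's heap so it fits inside the VM. A minimal sketch, reusing the PYSPARK_SUBMIT_ARGS variable already exported in the question's .bashrc (the 512m figure is only an illustrative assumption; size it to your VM):

# cap the driver heap so it fits the VM's memory (512m is a placeholder value)
export PYSPARK_SUBMIT_ARGS="--master local[2] --driver-memory 512m"

The same limit can alternatively be set as spark.driver.memory in conf/spark-defaults.conf under SPARK_HOME.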

Answer 1 (score: 0)

It may be that Spark is failing to find the PySpark shell. Try adding:

export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH

This works for Spark 1.6.1. If you have a different version, look for the corresponding py4j .zip file and add its path instead.
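
If you would rather not hard-code the py4j version, a small sketch that picks up whichever py4j archive ships with your Spark install (it assumes exactly one such .zip exists under $SPARK_HOME/python/lib):

# add the PySpark sources and the bundled py4j archive, whatever its version
export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
export PYTHONPATH=$(ls $SPARK_HOME/python/lib/py4j-*-src.zip):$PYTHONPATH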

Answer 2 (score: 0)

Two thoughts: Where is your JDK? I don't see a JAVA_HOME parameter configured anywhere in your files. That alone may be enough to produce:

Error: Must specify a primary resource (JAR or Python or R file)
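
A minimal sketch of what setting it in .bashrc could look like, assuming an OpenJDK 7 install in the usual Ubuntu location (adjust the path to wherever your JDK actually lives; "readlink -f $(which java)" can help you find it):

# point Spark at the JDK (path is an assumption for a default Ubuntu OpenJDK 7 install)
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export PATH=$JAVA_HOME/bin:$PATH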

Second, make sure that port 7770 is open and available to the JVM.