When running the following simple code with Sparkit-Learn:
from splearn.svm import SparkLinearSVC
spark = SparkLinearSVC()
I get the following error message:
ImportError: pyspark home needs to be added to PYTHONPATH.
export PYTHONPATH=$PYTHONPATH:$SPARK_HOME/python:../
Following the answers to these questions: "unable to add spark to PYTHONPATH" and "importing pyspark in python shell", I have added every possible PYTHONPATH configuration to my .bashrc, but the error persists.
My .bashrc currently contains the following paths:
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH=$JAVA_HOME/bin:$PATH
export PATH=/home/123/anaconda2/bin:$PATH
export SPARK_HOME=/home/123/Downloads/spark-1.6.1-bin-hadoop2.6
export PATH=$SPARK_HOME/bin:$PATH
export PATH=$JAVA_HOME/jre/lib/amd64/server:$PATH
export PATH=$JAVA_HOME/jre/lib/amd64:$PATH
export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH
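To check whether those exports are actually reaching the Python interpreter (they will not if the shell was not restarted after editing .bashrc, or if Python is launched from an IDE or notebook that does not source it), I ran a small diagnostic sketch like the one below; the helper name `spark_sys_path_entries` is mine, not part of any library:

```python
import os
import sys

def spark_sys_path_entries(environ=os.environ, path=sys.path):
    """Return the sys.path entries that live under SPARK_HOME, if any.

    An empty result means the exports in .bashrc never reached this
    interpreter, e.g. the shell was not restarted after editing the
    file, or Python was started outside that shell.
    """
    spark_home = environ.get("SPARK_HOME")
    if not spark_home:
        return []
    return [p for p in path if p.startswith(spark_home)]

if __name__ == "__main__":
    # What the interpreter actually sees, regardless of what .bashrc says.
    print("SPARK_HOME:", os.environ.get("SPARK_HOME"))
    print("Spark entries on sys.path:", spark_sys_path_entries())
```

In my case this is how I confirmed whether `$SPARK_HOME/python` and the py4j zip really end up on `sys.path` for the interpreter that raises the ImportError.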
Is there any possible solution?