PySpark installation error: "ImportError: No module named pyspark"

Time: 2016-05-27 12:10:12

Tags: apache-spark pyspark apache-spark-1.6

I am trying to install Apache Spark 1.6.1 in standalone mode. I followed the guide at "https://github.com/KristianHolsheimer/pyspark-setup-guide". However, after running
$ sbt/sbt assembly

I tried

$ ./bin/run-example SparkPi 10"

However, it failed with this error:

./bin/run-example: line 26: /home/dasprasun/opt/spark/bin/load-spark-env.sh: No such file or directory
Failed to find Spark examples assembly in /home/dasprasun/opt/spark/lib or /home/dasprasun/opt/spark/examples/target
You need to build Spark before running this program
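For reference, line 26 of run-example is where it sources bin/load-spark-env.sh, and the second error line names the two directories it searches for the examples assembly jar. A quick check along these lines (the install path is taken from the error message; the jar filename glob is an assumption) shows which pieces are actually missing:

$ SPARK_HOME=/home/dasprasun/opt/spark
$ # helper script that run-example tries to source
$ ls "$SPARK_HOME/bin/load-spark-env.sh"
$ # examples assembly jar, in the two locations the runner searches
$ ls "$SPARK_HOME"/lib/spark-examples-*.jar "$SPARK_HOME"/examples/target/spark-examples-*.jar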

After completing all the steps, I ran the following command in IPython:

In [1]: from pyspark import SparkContext

It fails with the following error:

ImportError                               Traceback (most recent call last)
<ipython-input-1-47c4965c5f0e> in <module>()
----> 1 from pyspark import SparkContext
ImportError: No module named pyspark
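For reference, the step in guides like the one above that makes pyspark importable is exporting SPARK_HOME and putting Spark's Python bindings on PYTHONPATH, roughly like this (the path matches the error above; the py4j version is an assumption for Spark 1.6):

$ export SPARK_HOME=/home/dasprasun/opt/spark
$ # Spark's Python package and its bundled py4j both need to be on PYTHONPATH
$ export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH

If these variables were never exported in the shell that launched IPython, the import fails exactly as shown.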

I do not understand what is going on. Please help me resolve this issue.

0 Answers:

No answers yet.