I'm using Python 3.6 on a MacBook with Java installed. I downloaded Spark 2.3.1 but couldn't get Spark to run. Incidentally, `pip3 install pyspark` completed successfully. I have no idea what's going wrong! I've attached my bash_profile and the terminal error below.
In .bash_profile:
export java_HOME="/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/HOme/"
export SPARK_HOME="/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/pyspark"
export PATH=$JAVA_HOME/bin:$SPARK_HOME:$SPARK_HOME/bin:$SPARK_HOME/sbin:$PATH
export PYSPARK_PYTHON=python3
In the terminal, it shows:
/Library/Frameworks/Python.framework/Versions/3.6/bin/pyspark: line 24: /Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/pyspark/bin/load-spark-env.sh: No such file or directory
/Library/Frameworks/Python.framework/Versions/3.6/bin/pyspark: line 77: /Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/pyspark/bin/spark-submit: No such file or directory
/Library/Frameworks/Python.framework/Versions/3.6/bin/pyspark: line 77: exec: /Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/pyspark/bin/spark-submit: cannot execute: No such file or directory
Answer 0 (score: 0)
You can verify whether the Spark files are present in /Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/pyspark. You should have the following folders there: LICENSE, NOTICE, R, README.md, RELEASE, bin, conf, data, examples, jars, kubernetes, licenses, python, sbin, yarn.
If not, download the Spark package and extract it to some directory X. If the extracted folder is named Y, set SPARK_HOME to the directory Y. In my case it looks like this: /Users/rs1223/spark_package/spark-2.3.2-bin-hadoop2.7
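The fix above can be sketched as the following .bash_profile fragment. The path shown is illustrative (it assumes you extracted Spark under your home directory); substitute the directory where you actually unpacked the download:

```shell
# Point SPARK_HOME at the extracted Spark distribution, not at the
# pip-installed pyspark package under site-packages.
# Example path only -- replace with your own extraction directory.
export SPARK_HOME="$HOME/spark_package/spark-2.3.2-bin-hadoop2.7"
export PATH="$SPARK_HOME/bin:$SPARK_HOME/sbin:$PATH"
export PYSPARK_PYTHON=python3
```

Then reload the file with `source ~/.bash_profile` before launching pyspark again.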
After this, retry running pyspark. It should work.