Installing Spark on Mac OS

Posted: 2018-10-12 03:56:38

Tags: python scala apache-spark

I'm trying to install Spark and the related tools on my Mac, but I get an error message when I test the installation.

/Users/somedirectory/apachespark/spark-2.3.0-bin-hadoop2.7/bin/pyspark
/Users/somedirectory/apachespark/spark-2.3.0-bin-hadoop2.7/bin/spark-class: line 71: /Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home//bin/java: No such file or directory

From my bash_profile entries...

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home/
export SPARK_HOME=/Users/somedirectory/apachespark/spark-2.3.0-bin-hadoop2.7
export SBT_HOME=/Users/somedirectory/apachespark/sbt
export SCALA_HOME=/Users/somedirectory/apachespark/scala-2.11.12
export PATH=$JAVA_HOME/bin:$SBT_HOME/bin:$SBT_HOME/lib:$SCALA_HOME/bin:$SCALA_HOME/lib:$PATH
export PATH=$JAVA_HOME/bin:$SPARK_HOME:$SPARK_HOME/bin:$SPARK_HOME/sbin:$PATH
export PYSPARK_PYTHON=python3
PATH="/Library/Frameworks/Python.framework/Versions/3.6/bin:${PATH}"
export PATH
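
A quick way to see how these variables actually expand (a sketch using only standard shell commands, nothing Spark-specific):

source ~/.bash_profile          # reload the profile in the current shell
echo "JAVA_HOME=$JAVA_HOME"     # inspect the expanded value
echo "SPARK_HOME=$SPARK_HOME"
ls "$JAVA_HOME/bin/java"        # check where Spark will look for the java binary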

Any suggestions for a fix? Thank you.

1 answer:

Answer 0: (score: 1)

As the reported error message shows:

/Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home//bin/java: No such file or directory

The file path to the java executable ends up with an extra / in $JAVA_HOME/bin because of the trailing / in JAVA_HOME:

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home/
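
Inside Spark's bin/spark-class, the JVM path is built roughly like this (a simplified sketch, not the verbatim script), which is where the double slash comes from:

# Simplified sketch of how bin/spark-class picks the JVM
if [ -n "${JAVA_HOME}" ]; then
  RUNNER="${JAVA_HOME}/bin/java"   # a trailing / in JAVA_HOME yields .../Home//bin/java
else
  RUNNER=java
fi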

Removing the trailing / from JAVA_HOME should fix the problem. Better still, setting JAVA_HOME as shown below automatically points to the active JDK version on Mac OSX:

export JAVA_HOME=$(/usr/libexec/java_home)
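
After editing ~/.bash_profile, reloading it and re-checking the path should confirm the fix (a quick sketch using standard shell commands):

source ~/.bash_profile           # reload the profile in the current shell
echo "$JAVA_HOME"                # should end in .../Contents/Home with no trailing slash
"$JAVA_HOME/bin/java" -version   # should print the JDK version
"$SPARK_HOME/bin/pyspark"        # should now launch the PySpark shell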