spark-shell and pyspark do not work properly

Date: 2018-03-22 17:26:59

Tags: apache-spark

I installed spark-2.3.0-bin-hadoop2.7 on Ubuntu, and I thought there were no problems with the Java path. When I run "spark-submit --version", "spark-shell", or "pyspark", I get the following error:

/usr/local/spark-2.3.0-bin-hadoop2.7/bin/spark-class: line 71: /usr/lib/jvm/java-8-openjdk-amd-64/jre/bin/java: No such file or directory

It seems the problem is with ".../bin/java", but I am not sure where to change the configuration. The spark-class file contains these lines:

if [ -n "${JAVA_HOME}" ]; then
  RUNNER="${JAVA_HOME}/bin/java"

My /etc/environment gives:

bash: /etc/environment: Permission denied

What I currently have in ~/.bashrc (edited with gedit) is:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd-64/jre

export PATH=$PATH:JAVA_HOME/bin

These are my current Java settings:

root@ubuntu:~# update-alternatives --config java
There is only one alternative in link group java (providing /usr/bin/java): /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java
Nothing to configure.

My .bashrc also contains the following:

export PATH=$PATH:/usr/share/scala-2.11.8/bin

export SPARK_HOME=/usr/local/spark-2.3.0-bin-hadoop2.7

export PATH=$PATH:$SPARK_HOME/bin

Please advise:

  1. Which files do I need to change?
  2. How do I need to change them?

1 answer:

Answer 0 (score: 0)

Java Home

Your JAVA_HOME should be set to your JDK directory, not its jre subdirectory:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd-64/jre

should be

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd-64

Here is Oracle's documentation on JAVA_HOME (it should also apply to OpenJDK): https://docs.oracle.com/cd/E19182-01/820-7851/inst_cli_jdk_javahome_t/
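As a minimal sketch (not part of the original answer): the update-alternatives output in the question reports the directory as java-8-openjdk-amd64, without the extra hyphen in "amd-64", and the PATH line in the question is missing a $. So ~/.bashrc could look something like this, assuming that directory actually exists on your machine:

# set JAVA_HOME to the JDK directory reported by update-alternatives (adjust if yours differs)
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH=$PATH:$JAVA_HOME/bin

Then reload the file and verify that the java binary is found:

source ~/.bashrc
echo $JAVA_HOME
$JAVA_HOME/bin/java -version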

Spark environment variables

JAVA_HOME should also be set in $SPARK_HOME/conf/spark-env.sh: https://spark.apache.org/docs/latest/configuration.html#environment-variables
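A minimal sketch of that step, assuming the same JDK path as above; Spark ships a spark-env.sh.template in its conf directory that can be copied and extended:

# create spark-env.sh from the shipped template and set JAVA_HOME in it
cd /usr/local/spark-2.3.0-bin-hadoop2.7/conf
cp spark-env.sh.template spark-env.sh
echo 'export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64' >> spark-env.sh

After that, spark-shell and pyspark pick up JAVA_HOME from spark-env.sh, independently of what is set in the shell environment.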