Error "No such file or directory" when trying to run spark-shell

Posted: 2019-03-26 06:18:39

Tags: scala apache-spark

I have updated the JDK, and now when I try to start spark-shell I get the following error:

/home/orienit/work/spark-1.6.0-bin-hadoop2.6/bin/spark-class: line 86:  
 /usr/lib/jvm/java-1.8.0-openjdk/bin/java: No such file or directory
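A quick way to confirm the cause is to check whether the path in the error message still exists and where the updated `java` binary actually lives now. This is a diagnostic sketch, not from the original post; it assumes a POSIX shell and that JDKs are installed under `/usr/lib/jvm`:

```shell
#!/bin/sh
# Path that spark-class is trying to execute, taken from the error message.
MISSING=/usr/lib/jvm/java-1.8.0-openjdk/bin/java

# Does that exact binary still exist after the JDK update?
[ -x "$MISSING" ] && echo "path from error still exists" || echo "path from error is gone"

# Where is java now, if anywhere on PATH?
command -v java || echo "no java on PATH"

# What JDKs are actually installed under /usr/lib/jvm?
ls /usr/lib/jvm 2>/dev/null || echo "no /usr/lib/jvm directory"
```

If the old path is gone, the directory listed by the last command shows what to point Spark at instead.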

1 answer:

Answer 0: (score: 0)

To make sure Spark points to the correct Java version:

1) Go to {spark-dir}/conf
2) cp spark-env.sh.template spark-env.sh
3) Edit spark-env.sh and add JAVA_HOME={JAVA_HOME_DIR}
4) Run ./spark-env.sh
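The steps above can be sketched as a shell session. To keep it safe to run anywhere, this sketch works against a throwaway temp directory instead of a real Spark install; in real use, replace $CONF with your {spark-dir}/conf and the JAVA_HOME value with the directory on your machine that actually contains bin/java (both are placeholders here, not values from the original post):

```shell
#!/bin/sh
# Stand-in for {spark-dir}/conf so this sketch has no side effects.
CONF=$(mktemp -d)
printf '# spark-env.sh.template\n' > "$CONF/spark-env.sh.template"

# Step 2: copy the template to the file Spark's launch scripts read.
cp "$CONF/spark-env.sh.template" "$CONF/spark-env.sh"

# Step 3: set JAVA_HOME so spark-class resolves the right java binary.
# The JDK path here is a hypothetical example -- use your own.
echo 'export JAVA_HOME=/usr/lib/jvm/java-8-openjdk' >> "$CONF/spark-env.sh"

# Show the line that was added.
grep JAVA_HOME "$CONF/spark-env.sh"
```

Note that spark-env.sh is read by Spark's own launch scripts each time they run, so after editing it you can simply start spark-shell again.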