Why does start-all.sh run as root fail with "failed to launch org.apache.spark.deploy.master.Master: JAVA_HOME is not set"?

Date: 2015-11-27 10:43:59

Tags: java scala apache-spark cloudera

I am trying to execute a Spark application, built with Scala IDE, against the standalone Spark service running on the Cloudera QuickStart VM 5.3.0.

The JAVA_HOME for my cloudera account is /usr/java/default.

However, when I execute the start-all.sh command as the cloudera user, I get the following error messages:

[cloudera@localhost sbin]$ pwd
/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin
[cloudera@localhost sbin]$ ./start-all.sh
chown: changing ownership of `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs': Operation not permitted
starting org.apache.spark.deploy.master.Master, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/spark-daemon.sh: line 151: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out: Permission denied
failed to launch org.apache.spark.deploy.master.Master:
tail: cannot open `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out' for reading: No such file or directory
full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
cloudera@localhost's password: 
localhost: chown: changing ownership of `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs': Operation not permitted
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out
localhost: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/spark-daemon.sh: line 151: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out: Permission denied
localhost: failed to launch org.apache.spark.deploy.worker.Worker:
localhost: tail: cannot open `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out' for reading: No such file or directory
localhost: full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out

I added export CMF_AGENT_JAVA_HOME=/usr/java/default to /etc/default/cloudera-scm-agent and ran sudo service cloudera-scm-agent restart. See How to set CMF_AGENT_JAVA_HOME
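The step above can be sketched as a small shell snippet. For safety it writes to a temporary file standing in for the real /etc/default/cloudera-scm-agent; the variable name and path are the ones from the question, not re-verified here.

```shell
# Sketch of the edit described above. A temp file stands in for
# /etc/default/cloudera-scm-agent so nothing on the system is touched.
conf=$(mktemp)
echo 'export CMF_AGENT_JAVA_HOME=/usr/java/default' >> "$conf"

# Source it the way an init script would, then confirm the value.
. "$conf"
echo "CMF_AGENT_JAVA_HOME=$CMF_AGENT_JAVA_HOME"

# On the real VM the last step would be:
#   sudo service cloudera-scm-agent restart
rm -f "$conf"
```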

I also added export JAVA_HOME=/usr/java/default to the locate_java_home function definition in the file /usr/share/cmf/bin/cmf-server, and restarted the cluster and the standalone Spark service.

However, when starting the Spark service from the root user, the following error still appears:

failed to launch org.apache.spark.deploy.master.Master: JAVA_HOME is not set

Can someone suggest how to set JAVA_HOME so that the Spark standalone service can be started on the Cloudera Manager setup?

2 answers:

Answer 0 (score: 5)

The solution was quite simple and straightforward. I just added export JAVA_HOME=/usr/java/default to /root/.bashrc, and it then successfully launched the Spark services from the root user without the JAVA_HOME is not set error. Hope it helps anyone facing the same problem.
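A minimal sketch of this fix follows; a temporary file stands in for /root/.bashrc so the snippet can run without root privileges.

```shell
# Sketch of the accepted fix: append the export line to the root
# user's .bashrc. A temp file stands in for /root/.bashrc here.
bashrc=$(mktemp)
echo 'export JAVA_HOME=/usr/java/default' >> "$bashrc"

# A new root login shell would source .bashrc; simulate that:
. "$bashrc"
echo "JAVA_HOME=$JAVA_HOME"
rm -f "$bashrc"
```

After this, start-all.sh launched from root inherits JAVA_HOME and the daemons can locate the JVM.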

Answer 1 (score: 0)

Set the JAVA_HOME variable in ~/.bashrc as follows:

sudo gedit ~/.bashrc

Write this line into the file (using the path of your installed JDK):

export JAVA_HOME="/usr/lib/jvm/java-11-openjdk-amd64"

Then run:

source ~/.bashrc
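After sourcing, it is worth verifying that the path actually contains a JDK before retrying start-all.sh. A small sketch of such a check (the check_java_home helper is hypothetical, not part of Spark or Cloudera):

```shell
# Hypothetical helper: a JAVA_HOME is only usable if bin/java
# exists and is executable underneath it.
check_java_home() {
    [ -n "$1" ] && [ -x "$1/bin/java" ]
}

if check_java_home "$JAVA_HOME"; then
    echo "JAVA_HOME OK: $JAVA_HOME"
else
    echo "JAVA_HOME is not set or does not point at a JDK"
fi
```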