I am trying to set up a Spark standalone cluster with spark-2.4.0-bin-hadoop2.6 on Ubuntu 14.XXX. I added the following to my bashrc file:
export SPARK_HOME=/home/xxxx/spark
export PATH=$SPARK_HOME/bin:$PATH
and made the corresponding changes in the conf/spark-env.sh file.
Running Spark's start-all script gives me the following output:
start-all.sh: line 29: /home/xxxx/spark/sbin/spark-config.sh: No such file or directory
./start-all.sh: line 32: /home/xxxx/spark/sbin/start-master.sh: No such file or directory
./start-all.sh: line 35: /home/xxxxxx/spark/sbin/start-slaves.sh: No such file or directory
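To narrow this down, here is a minimal sketch of a check I could run, assuming SPARK_HOME is the (redacted) path from my bashrc; it reports whether each script the errors complain about actually exists under $SPARK_HOME/sbin:

```shell
# Sketch: verify the sbin scripts exist at the path configured in .bashrc.
# /home/xxxx/spark is the redacted install location from the question above.
SPARK_HOME=/home/xxxx/spark

for f in spark-config.sh start-master.sh start-slaves.sh; do
  if [ -f "$SPARK_HOME/sbin/$f" ]; then
    echo "$f: found"
  else
    echo "$f: MISSING"
  fi
done
```

If these print MISSING, SPARK_HOME presumably does not point at the extracted distribution directory (the one that contains bin/ and sbin/), which would explain the "No such file or directory" errors.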