Unable to start Spark standalone on Ubuntu

Date: 2016-07-13 07:59:14

Tags: apache-spark bigdata

I'm having trouble starting Spark standalone on my local machine. I'm following this manual, which is pretty straightforward, but I'm still missing something.

When I run start-master.sh, it prints the following errors:

couto@ubuntu:~/Downloads/spark-1.6.2-bin-hadoop2.4/sbin$ sh start-master.sh 
start-master.sh: 31: start-master.sh: [[: not found
start-master.sh: 31: start-master.sh: [[: not found
start-master.sh: 45: start-master.sh: 0: not found
start-master.sh: 52: /home/couto/Downloads/spark-1.6.2-bin-hadoop2.4/bin/load-spark-env.sh: [[: not found
starting org.apache.spark.deploy.master.Master, logging to /home/couto/Downloads/spark-1.6.2-bin-hadoop2.4/logs/spark-couto-org.apache.spark.deploy.master.Master-1-ubuntu.out
start-master.sh: 78: [: false: unexpected operator

Even so, the master actually seems to start correctly:

(screenshot of the master's web UI)

However, as soon as I try to start a slave, it prints the errors below, and no slaves show up in the Spark web console.

couto@ubuntu:~/Downloads/spark-1.6.2-bin-hadoop2.4/sbin$ sh start-slave.sh spark://localhost:7077
start-slave.sh: 42: start-slave.sh: [[: not found
start-slave.sh: 42: start-slave.sh: [[: not found
start-slave.sh: 42: start-slave.sh: [[: not found
start-slave.sh: 52: /home/couto/Downloads/spark-1.6.2-bin-hadoop2.4/bin/load-spark-env.sh: [[: not found
start-slave.sh: 68: start-slave.sh: function: not found
start-slave.sh: 70: shift: can't shift that many

I'm using JDK 1.8.0_91 and Spark spark-1.6.2-bin-hadoop2.4 (I also tried the build prebuilt for Hadoop 2.6).

Using start-all.sh as suggested in the first comment:

couto@ubuntu:~/Downloads/spark-1.6.2-bin-hadoop2.4/sbin$ sh start-all.sh 
start-all.sh: 30: start-all.sh: 0: not found
starting org.apache.spark.deploy.master.Master, logging to /home/couto/Downloads/spark-1.6.2-bin-hadoop2.4/logs/spark-couto-org.apache.spark.deploy.master.Master-1-ubuntu.out
localhost: ssh: connect to host localhost port 22: Connection refused
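
Update: the last line suggests that start-all.sh tries to ssh into localhost to launch the worker, and no SSH server is listening on port 22. Assuming Ubuntu's openssh-server package, something like this should at least get past the ssh error:

sudo apt-get install openssh-server        # start-all.sh launches workers over ssh, even on localhost

ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa   # optional: key-based login so the script doesn't prompt
ssh-copy-id localhost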
Can you give me any pointers?

3 Answers:

Answer 0 (score: 1)

Run them as follows and it works fine:

bash start-master.sh 
bash start-slave.sh spark://localhost:7077
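
Once both commands come back clean, the worker should show up on the master's web UI, which by default listens on http://localhost:8080.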

Kind regards

Answer 1 (score: 0)

All of the scripts that ship with Spark are Bash scripts and use some Bash-specific syntax. You are running them with the sh interpreter, which is what causes the errors. You should execute them directly:

chmod +x start-all.sh
./start-all.sh
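
To see why this matters: on Ubuntu, /bin/sh is typically a symlink to dash, which does not understand Bash-only constructs such as [[ ]] or the function keyword. A quick check outside of Spark (assuming a stock Ubuntu install) reproduces the exact error from the question:

ls -l /bin/sh                        # on Ubuntu this usually points to dash, not bash
sh -c '[[ 1 -eq 1 ]] && echo ok'     # dash fails with: sh: 1: [[: not found
bash -c '[[ 1 -eq 1 ]] && echo ok'   # bash prints: ok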

Answer 2 (score: 0)

Add the sbin path to your .bashrc:

export SPARK_HOME=/usr/local/spark
export PATH=$PATH:$SPARK_HOME/bin
export PATH=$PATH:$SPARK_HOME/sbin

Then reload .bashrc:

source ~/.bashrc
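
After that (assuming SPARK_HOME really points at your Spark install), the scripts can be run from any directory, and without the bash prefix, since their shebang line already invokes Bash:

start-master.sh                         # found via PATH, runs under Bash thanks to its shebang
start-slave.sh spark://localhost:7077   # registers a worker with the local master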