Spark standalone not working

Date: 2016-09-01 06:17:12

Tags: apache-spark

I am getting an error in Spark standalone mode. Whenever I try to start

./start-master.sh 

I get the error below:

 spark-2.0.0-bin-hadoop2.7/bin/spark-class: line 93: [: too many arguments

I have also tried to debug the spark-class file. The script fails at this line of spark-class:

if [ $LAUNCHER_EXIT_CODE != 0 ]; then
  exit $LAUNCHER_EXIT_CODE
fi 

Could this be the issue?
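
For reference, a minimal sketch of why this test can print "[: too many arguments": if the unquoted variable expands to more than one word, bash passes every word to the [ builtin. The value below is hypothetical, purely to illustrate the failure mode:

 # Hypothetical: the launcher printed an error message instead of a
 # numeric exit code, so the variable holds several words.
 LAUNCHER_EXIT_CODE="Error: Could not find or load main class"

 # Unquoted, this expands to [ Error: Could ... != 0 ], and bash
 # reports "[: too many arguments". Quoting "$LAUNCHER_EXIT_CODE"
 # would avoid the word splitting.
 if [ $LAUNCHER_EXIT_CODE != 0 ]; then
   exit 1
 fi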

2 Answers:

Answer 0 (score: 0):

It seems this was a bug, and it has been fixed in versions > 2.0.0.

I was able to reproduce the issue, and updating to a newer version solved it.

More information at https://issues.apache.org/jira/browse/SPARK-16586
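
The fix described in that ticket checks that the launcher's output is actually numeric before comparing it. A rough sketch of that kind of guard (paraphrased from the ticket discussion; not necessarily the exact committed patch):

 # If the launcher produced something other than a numeric exit code
 # (for example an error message), surface it and fail cleanly instead
 # of feeding a multi-word value to the [ test.
 if ! [[ $LAUNCHER_EXIT_CODE =~ ^[0-9]+$ ]]; then
   echo "$LAUNCHER_EXIT_CODE" 1>&2
   exit 1
 fi

 if [ $LAUNCHER_EXIT_CODE != 0 ]; then
   exit $LAUNCHER_EXIT_CODE
 fi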

Answer 1 (score: 0):

You need to run the master and worker manually, as follows (see the sample session after this list):

1. Go to the C:\spark\bin folder

2. Run spark-class org.apache.spark.deploy.master.Master (it will give you a URL of the form spark://ip:port)

3. Run spark-class org.apache.spark.deploy.worker.Worker spark://ip:port

4. Run spark-shell --master spark://ip:port to connect an application to the newly created cluster.
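
Put together, a hypothetical session might look like this (the IP address and port are placeholders; 7077 is Spark's default master port, and the master's startup log prints the actual URL, e.g. a line like "Starting Spark master at spark://192.168.1.10:7077"):

 cd C:\spark\bin
 spark-class org.apache.spark.deploy.master.Master
 # in a second terminal, once the master log shows its URL:
 spark-class org.apache.spark.deploy.worker.Worker spark://192.168.1.10:7077
 # in a third terminal, connect an application to the cluster:
 spark-shell --master spark://192.168.1.10:7077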