Trying to use spark-shell and spark-submit, and getting this exception?
Initializing SparkContext with MASTER: spark://1.2.3.4:7077
ERROR 2015-06-11 14:08:29 org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
WARN 2015-06-11 14:08:29 org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend: Application ID is not initialized yet.
ERROR 2015-06-11 14:08:30 org.apache.spark.scheduler.TaskSchedulerImpl: Exiting due to error from cluster scheduler: All masters are unresponsive! Giving up.
Answer 0 (score: 1)
Make sure the master URL is correct and that the master is still alive.
You can find the correct URL in the Spark web UI in your browser. If you are running the master locally, try typing localhost:8080 into any browser window.
More information about the web UI is here: https://spark.apache.org/docs/1.2.0/monitoring.html
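As a quick sketch of the check described above, assuming a standalone master whose web UI is on the default port 8080 (the host and port below are placeholders, not values from the question):

```shell
# Ask the master's web UI page for the spark:// URL it advertises;
# this is the exact value to pass to --master.
curl -s http://localhost:8080 | grep -o 'spark://[^<" ]*'

# Then point spark-shell (or spark-submit) at that URL, for example:
spark-shell --master spark://1.2.3.4:7077
```

If the curl request itself fails, the master process is likely down or listening on a different host/port, which matches the "All masters are unresponsive" error above.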