Apache Spark shell cannot start executors, gets EXITED in cluster mode

Date: 2016-07-21 08:31:31

Tags: apache-spark pyspark

I am trying to launch the Spark shell on a cluster and keep getting errors like the following -

16/07/21 11:27:28 INFO client.AppClient$ClientEndpoint: Executor updated: app-20160721112151-0000/179 is now RUNNING
16/07/21 11:27:33 INFO client.AppClient$ClientEndpoint: Executor updated: app-20160721112151-0000/177 is now EXITED (Command exited with code 1)
16/07/21 11:27:33 INFO cluster.SparkDeploySchedulerBackend: Executor app-20160721112151-0000/177 removed: Command exited with code 1
16/07/21 11:27:33 INFO cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 177
16/07/21 11:27:33 INFO client.AppClient$ClientEndpoint: Executor added: app-20160721112151-0000/180 on worker-20160721112059-10.65.104.9-43892 (10.65.104.9:43892) with 8 cores
16/07/21 11:27:33 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160721112151-0000/180 on hostPort 10.65.104.9:43892 with 8 cores, 4.0 GB RAM
16/07/21 11:27:33 INFO client.AppClient$ClientEndpoint: Executor updated: app-20160721112151-0000/180 is now RUNNING
16/07/21 11:27:34 INFO client.AppClient$ClientEndpoint: Executor updated: app-20160721112151-0000/178 is now EXITED (Command exited with code 1)
16/07/21 11:27:34 INFO cluster.SparkDeploySchedulerBackend: Executor app-20160721112151-0000/178 removed: Command exited with code 1
16/07/21 11:27:34 INFO cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 178
16/07/21 11:27:34 INFO client.AppClient$ClientEndpoint: Executor added: app-20160721112151-0000/181 on worker-20160721112059-10.65.105.6-37622 (10.65.105.6:37622) with 8 cores

I also tried submitting a job, with the same result. I checked my slaves file and went through all the configuration, but could not find anything wrong.

The same thing does not happen when the shell is started in local mode, so I suspected it might be host-related and checked /etc/hosts -

127.0.0.1       localhost
127.0.1.1       theubuntu
10.65.104.9     Mainserver
10.65.105.6     Client1
10.65.104.16    Client2
10.65.104.14    Client3
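One way to confirm that name resolution is consistent on each node is to query the resolver directly (a diagnostic sketch using the hostnames from the file above; run it on every machine in the cluster):

```shell
# getent consults /etc/hosts as well as DNS, so a stale or missing entry
# shows up here. Run on each node and compare the addresses.
for h in Mainserver Client1 Client2 Client3; do
  getent hosts "$h" || echo "$h: no entry"
done

# The driver's own hostname should not resolve to a loopback address
# (e.g. 127.0.1.1) if other nodes need to reach it.
getent hosts "$(hostname)"
```

If a node's own name resolves to 127.0.1.1, executors on other machines may fail to connect back to the driver even though everything looks fine locally.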

Any suggestions?

1 answer:

Answer 0 (score: 0)

The mistake was in my spark-env.sh, where I had set

export SPARK_JAVA_OPTS=-Dspark.driver.port=53411

but that port was not available. The worker had logged the actual error correctly under SPARK_HOME/work/<application-id>; I just had not looked there.
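The diagnosis above can be sketched as two checks: verify the port is actually free before pinning spark.driver.port to it, and read the executor's stderr under the worker's work directory (the application id below is the one from the log; the exact path depends on your standalone deployment):

```shell
# Is anything already listening on the port assigned to spark.driver.port?
# (ss is part of iproute2; use netstat -tln on older systems.)
ss -tln | grep -q ':53411 ' && echo "53411 in use" || echo "53411 free"

# If it is in use, the executor's real failure reason is in the worker's
# per-application log directory, e.g.:
#   $SPARK_HOME/work/app-20160721112151-0000/<executor-id>/stderr
```

Leaving spark.driver.port unset lets Spark pick a free ephemeral port automatically, which avoids this class of failure unless a firewall forces you to pin a specific port.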