After starting a master and a worker on the same machine...
spark-class org.apache.spark.deploy.master.Master -h 127.0.0.1 -p 7070
spark-class org.apache.spark.deploy.worker.Worker spark://127.0.0.1:7070
...and submitting the following Spark job...
spark-submit --class Main --master spark://127.0.0.1:7070 --deploy-mode client /path/to/app.jar
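For context, the application itself is trivial. The actual code in app.jar is not shown here, but it is functionally equivalent to the minimal sketch below (the object name Main matches the --class argument; the reduce job is just a placeholder):

import org.apache.spark.sql.SparkSession

object Main {
  def main(args: Array[String]): Unit = {
    // The master URL is supplied by spark-submit (--master spark://127.0.0.1:7070)
    val spark = SparkSession.builder().appName("test-app").getOrCreate()
    val sc = spark.sparkContext

    // Hypothetical placeholder job: sum 1..100 on the executors
    val total = sc.parallelize(1 to 100).reduce(_ + _)
    println(s"total = $total")

    // Shutting down the SparkContext releases the executors
    spark.stop()
  }
}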
The application runs to completion successfully, but for some reason the executor is forcibly killed:
19/05/10 09:28:31 INFO Worker: Asked to kill executor app-20190510092810-0000/0
19/05/10 09:28:31 INFO ExecutorRunner: Runner thread for executor app-20190510092810-0000/0 interrupted
19/05/10 09:28:31 INFO ExecutorRunner: Killing process!
19/05/10 09:28:31 INFO Worker: Executor app-20190510092810-0000/0 finished with state KILLED exitStatus 1
Is this expected behavior? If not, how can I prevent it from happening?
I am using Spark 2.4.0.