How to programmatically kill a Spark job found via ps -ef

Posted: 2019-06-20 13:15:32

Tags: apache-spark yarn spark-submit

I want to kill a Spark job programmatically. Here is the scenario:

When I cancel the Spark job with yarn application -kill, it is reported as killed, but if I run ps -ef | grep <app_name>, the Spark job's process entry is still there. How do I make sure that process gets killed as well?

I want to do this programmatically because I am launching spark-submit from code.
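For reference, here is a minimal sketch of the kind of invocation I mean; the master, class, and jar below are placeholders, not my actual job:

# Hypothetical launch script: run spark-submit in the background and
# record its PID so the process can be signalled later.
spark-submit --master yarn --class com.example.Main /path/to/app.jar &
SUBMIT_PID=$!
echo "spark-submit running as PID ${SUBMIT_PID}"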

Any help on this would be appreciated.

Thanks.

1 answer:

Answer 0 (score: 0)

You will want to use 'ps -ef | grep SparkSubmit' to find any rogue Spark jobs, then run kill -9 on their PIDs to terminate them. Just be careful not to kill the main Spark job!

# Find all running SparkSubmit processes
[stack_overflow@stack_overflow ~]$ ps -ef | grep SparkSubmit
stack_overflow  96747  96736 99 11:19 pts/15   00:01:55 /usr/bin/java -cp /opt/spark/conf/:/opt/spark/jars/* -Dscala.usejavacp=true -Xmx1g -Dderby.system.home=/home/stack_overflow/Spark/ org.apache.spark.deploy.SparkSubmit --conf spark.local.dir=/opt/spark/temp_land/spark-temp --conf spark.driver.extraJavaOptions=-Dderby.system.home=/home/stack_overflow/ --class org.apache.spark.repl.Main --name Spark shell spark-shell
stack_overflow  97410  14952  0 11:20 pts/15   00:00:00 grep --color=auto SparkSubmit
# 96747 is the Spark job I forced to become unresponsive
# 97410 is just the grep command itself -- do not try to kill it
# Run kill on the job; this only works if you have permissions on that process
[stack_overflow@stack_overflow ~]$ kill -9 96747
# The job is now dead and gone
[stack_overflow@stack_overflow ~]$ ps -ef | grep SparkSubmit
stack_overflow  96190  14952  0 11:17 pts/15   00:00:00 grep --color=auto SparkSubmit
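Since you want to do this programmatically, the same steps can be scripted. Below is a minimal bash sketch, assuming pgrep is available and that the target job's command line contains both SparkSubmit and the --name you passed to spark-submit; APP_NAME is a hypothetical placeholder, not from the original post:

#!/bin/bash
# Minimal sketch: find and kill the SparkSubmit process for one
# specific application. APP_NAME is a hypothetical placeholder for
# the --name passed to spark-submit.
APP_NAME="my_spark_app"

# pgrep -f matches against the full command line, so this finds only
# SparkSubmit processes that also mention APP_NAME, and (unlike
# ps -ef | grep) it never matches this script itself.
for pid in $(pgrep -f "SparkSubmit.*${APP_NAME}"); do
    echo "Killing Spark process ${pid}"
    kill -9 "${pid}"
done

Matching on a specific application name keeps the loop from touching the main Spark job mentioned above.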