How to kill a Spark job by its job name on a standalone cluster

Posted: 2019-02-04 21:47:15

Tags: apache-spark sandbox hdp

How can I cancel a Spark job by its job name on a standalone cluster? How can I list Spark job IDs on the sandbox? Is there any command similar to yarn application -list?

1 answer:

Answer 0 (score: 1)

#!/bin/bash
# Kill a Spark job on a standalone cluster by matching its application name and jar.
# Usage: ./kill_spark_job.sh <app-name> <jar-name>
app_name=$1
jar=$2

# Find the first spark-submit (SparkSubmit) process that matches both the application
# name and the jar, skip the grep pipeline itself, and read its PID
# (the PID is the second column of ps -ef output).
ps -ef | grep -w "${app_name}" | grep -w 'org.apache.spark.deploy.SparkSubmit' \
    | grep -w "${jar}" | grep -v grep | head -n 1 | awk '{print $2}' \
    | while read -r pid; do
        kill -9 "$pid"
        echo "Process Id $pid and Application \"$app_name\" killed"
done
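
The script above assumes the job was launched in client deploy mode on the node where you run it, so killing the local SparkSubmit JVM also stops the driver. If the job was instead submitted to the standalone master in cluster deploy mode, spark-submit itself can query and kill the driver by its submission ID, which is listed (together with running applications) on the standalone master web UI, by default on port 8080. A minimal sketch, assuming a master host named sandbox-hdp and a hypothetical driver ID; use the same master URL the job was originally submitted with:

# The standalone master web UI (http://sandbox-hdp:8080 here) lists running
# applications and drivers with their IDs; copy the driver ID from there.

# Check the status of a driver that was submitted with --deploy-mode cluster
# (host name and driver ID below are placeholders):
./bin/spark-submit --master spark://sandbox-hdp:7077 --status driver-20190204214715-0000

# Kill that driver:
./bin/spark-submit --master spark://sandbox-hdp:7077 --kill driver-20190204214715-0000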