How do I kill a Spark job by its job name on a standalone cluster? And how do I list Spark job IDs on the sandbox? Is there any command similar to yarn application -list?
Answer 0 (score: 1)
#!/bin/bash
# Kill a Spark job on a standalone cluster, matched by application name
# and jar in the process list.
app_name=$1
jar=$2

# Find SparkSubmit processes matching both the application name and the jar.
# In `ps -ef` output the PID is the second column (the third is the parent
# PID), so take $2 of the first matching line.
ps -ef | grep -w "$app_name" \
       | grep -w 'org.apache.spark.deploy.SparkSubmit' \
       | grep -w "$jar" \
       | awk 'NR==1 {print $2}' > kill.txt

while read -r pid; do
  kill -9 "$pid"
  echo "Process Id $pid and Application \"$app_name\" killed"
done < kill.txt
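Standalone mode has no exact counterpart to yarn application -list, but the standalone master publishes its state as JSON on the web-UI port (8080 by default) under the /json path, which is cleaner than grepping ps output. Below is a minimal sketch that parses that response; the /json path and the activeapps/id/name field names are assumptions based on the standalone master web UI and may differ between Spark versions.

```shell
# list_spark_apps: read the standalone master's JSON status page from stdin
# and print each running application's ID and name.
# NOTE: the "activeapps", "id", and "name" fields are assumptions based on
# the standalone master web UI; verify against your Spark version.
list_spark_apps() {
  python3 -c '
import json, sys
state = json.load(sys.stdin)
for app in state.get("activeapps", []):
    print(app["id"], app["name"])
'
}

# Typical usage (requires a running standalone master):
#   curl -s http://master-host:8080/json | list_spark_apps
```

For drivers submitted with --deploy-mode cluster, Spark also ships a kill subcommand via spark-class org.apache.spark.deploy.Client kill <master-url> <driver-id>; check your Spark version's documentation, as this interface is not guaranteed to be stable.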