spark-submit does not exit even after the process completes successfully

Date: 2018-04-18 19:29:53

Tags: scala apache-spark spark-dataframe

I wrote a simple application that runs fine, but when I submit it through spark-submit, the spark-submit session does not finish even after spark.stop() is called, and I have to kill the PID.

Below is the code snippet:

import java.util.concurrent.Executors

import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

import org.apache.spark.sql.{DataFrame, SparkSession}

object FaultApp {

  case class Person(name: String, age: Long)

  def main(args: Array[String]): Unit = {

    val spark = SparkSession
      .builder
      .enableHiveSupport()
      .config("spark.scheduler.mode", "FAIR")
      .appName("parjobs")
      .getOrCreate()

    import spark.implicits._

    // Create the implicit ExecutionContext based on our own thread pool
    val pool = Executors.newFixedThreadPool(5)
    implicit val xc = ExecutionContext.fromExecutorService(pool)

    import Function._

    val caseClass = Seq(Person("X", 32),
                        Person("Y", 37),
                        Person("Z", 37),
                        Person("A", 6))

    val caseClassDS = caseClass.toDF()

    val taskA = write_first(caseClassDS)

    Await.result(Future.sequence(Seq(taskA)), Duration(1, MINUTES))

    spark.stop()

    println("After Spark Stop command")
  }
}

object Function {
  def write_first(ds: DataFrame)(implicit xc: ExecutionContext) = Future {
    Thread.sleep(10000)
    ds.write.format("orc").mode("overwrite")
      .option("compression", "zlib")
      .saveAsTable("save_1")
  }
}

I submit the job with the following command:

spark-submit --master yarn --deploy-mode client fault_application-assembly-1.0-SNAPSHOT.jar --executor-memory 1G --executor-cores 2 --driver-memory 1G

Below are the last few lines of the log:

18/04/18 15:15:20 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
18/04/18 15:15:20 INFO YarnClientSchedulerBackend: Stopped
18/04/18 15:15:20 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/04/18 15:15:20 INFO MemoryStore: MemoryStore cleared
18/04/18 15:15:20 INFO BlockManager: BlockManager stopped
18/04/18 15:15:20 INFO BlockManagerMaster: BlockManagerMaster stopped
18/04/18 15:15:20 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/04/18 15:15:20 INFO SparkContext: Successfully stopped SparkContext
After Spark Stop command

Any help or advice would be greatly appreciated.

1 Answer:

Answer 0 (score: 1)

That's because you are creating an execution context backed by your own thread pool; its worker threads are non-daemon, so the program will not exit until the pool is shut down.
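To see the mechanism in isolation, here is a minimal sketch (a hypothetical stand-alone example, not from the question) in which a fixed thread pool keeps a plain JVM process alive in exactly the same way:

import java.util.concurrent.Executors

object PoolHangDemo {
  def main(args: Array[String]): Unit = {
    val pool = Executors.newFixedThreadPool(5)
    // Submitting work makes the pool spawn a non-daemon worker thread.
    pool.execute(new Runnable { def run(): Unit = println("task done") })
    println("end of main")
    // The process now hangs, just like the spark-submit session,
    // unless the pool is shut down:
    // pool.shutdown()
  }
}

Uncommenting pool.shutdown() lets the worker thread terminate and the process exit.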

After spark.stop(), add:

xc.shutdown()
println("After shutdown.")

On a side note, instead of creating a new execution context for your futures, you can just use the global one:

import scala.concurrent.ExecutionContext.Implicits.global