How can I find out the executor IPs of a Spark application?

Asked: 2016-09-29 08:28:29

Tags: apache-spark

I want to get the IP addresses of all executors at runtime. Which Spark API should I use, or is there another way to obtain the IPs while the application is running?

2 answers:

Answer 0 (score: 1)

Apache Spark has a class named ExecutorInfo, whose executorHost method returns the host the executor is running on.
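
As a rough illustration (the listener class name below is made up for this example), ExecutorInfo instances are handed to your code through executor lifecycle events, from which executorHost can be read:

import org.apache.spark.scheduler.{SparkListener, SparkListenerExecutorAdded}

// Minimal sketch: print the host of every executor as it registers.
class PrintExecutorHosts extends SparkListener {
  override def onExecutorAdded(event: SparkListenerExecutorAdded): Unit =
    println(event.executorInfo.executorHost)
}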

Answer 1 (score: 1)

You should use the SparkListener abstract class and intercept the two executor-specific events, SparkListenerExecutorAdded and SparkListenerExecutorRemoved:

import org.apache.spark.scheduler.{SparkListener, SparkListenerExecutorAdded, SparkListenerExecutorRemoved}
import scala.collection.mutable

// Tracks which host each executor was started on.
class ExecutorHostListener extends SparkListener {
  private val executors = mutable.Map.empty[String, String]

  override def onExecutorAdded(executorAdded: SparkListenerExecutorAdded): Unit = {
    val execId = executorAdded.executorId
    val host = executorAdded.executorInfo.executorHost
    executors += (execId -> host)
    println(s">>> executor id=$execId added on host=$host")
  }

  override def onExecutorRemoved(executorRemoved: SparkListenerExecutorRemoved): Unit = {
    val execId = executorRemoved.executorId
    val host = executors.remove(execId).getOrElse("Host unknown")
    println(s">>> executor id=$execId removed from host=$host")
  }
}

The complete working project is in my Spark Executor Monitor Project.
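
For completeness, a sketch of how such a listener can be registered (assuming the ExecutorHostListener class from the snippet above; SparkContext.addSparkListener and the spark.extraListeners setting are standard Spark mechanisms):

import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("executor-host-demo"))

// Programmatic registration: executor add/remove events start flowing to the listener.
sc.addSparkListener(new ExecutorHostListener)

// Alternatively, register via configuration when submitting the application:
//   --conf spark.extraListeners=fully.qualified.ExecutorHostListener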