SparkContext
has a getExecutorMemoryStatus
method, but it only reports the executors' memory status. Is there a way to get
each executor's core count? I am using a Spark Standalone cluster.
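For reference, getExecutorMemoryStatus only returns memory figures, not cores. A minimal sketch (assuming an existing SparkContext named sc):

sc.getExecutorMemoryStatus.foreach { case (executor, (maxMem, remainingMem)) =>
  // Each entry maps an executor's host:port to its maximum memory
  // available for caching and its remaining free memory, in bytes.
  println(s"$executor: max=$maxMem bytes, remaining=$remainingMem bytes")
}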
Answer 0 (score: 1)
Option 2: the default value:
sc.defaultParallelism
This is usually set to the total number of worker cores in the cluster.
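For example (a minimal sketch, again assuming an existing SparkContext named sc):

// In a Standalone cluster, defaultParallelism typically equals the
// total number of cores across all executors.
println(s"defaultParallelism = ${sc.defaultParallelism}")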
Option 3: You can use ExecutorInfo.totalCores, as shown below; it should work.
The docs say:
public class ExecutorInfo extends Object
Stores information about an executor, to be passed from the scheduler to SparkListeners.
import org.apache.spark.scheduler.{SparkListener, SparkListenerExecutorAdded}
/**
* Logs info of added executors.
*/
final class ExecutorLogger extends SparkListener {
  override def onExecutorAdded(executorAdded: SparkListenerExecutorAdded): Unit =
    println(s"\rExecutor ${executorAdded.executorId} added: ${executorAdded.executorInfo.executorHost} ${executorAdded.executorInfo.totalCores} cores")
}
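To use it, register the listener on the SparkContext before executors come up (a sketch; sc is assumed to be your SparkContext):

// Register the listener so onExecutorAdded fires for each new executor.
sc.addSparkListener(new ExecutorLogger)

Each added executor will then print its host and total core count, which you can sum to get the cluster-wide core count.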