SparkAppHandle state is not updated in Kubernetes

Time: 2019-04-16 07:40:39

Tags: apache-spark kubernetes spark-launcher

When launching a Spark application through SparkLauncher, the SparkAppHandle state never gets updated.

SparkLauncher sparkLaunch = new SparkLauncher()
    .setSparkHome("/root/test/spark-2.4.0-bin-hadoop2.7")
    .setMaster("k8s://https://172.16.23.30:6443")
    .setVerbose(true)
    .addSparkArg("--verbose")
    .setAppResource("local:///opt/spark/examples/jars/spark-examples_2.11-2.4.0.jar")
    .setConf("spark.app.name", "spark-pi")
    .setMainClass("org.apache.spark.examples.SparkPi")
    .setConf("spark.executor.instances", "5")
    .setConf("spark.kubernetes.container.image", "registry.renovite.com/spark:v2")
    .setConf("spark.kubernetes.driver.pod.name", "spark-pi-driver")
    .setConf("spark.kubernetes.container.image.pullSecrets", "dev-registry-key")
    .setConf("spark.kubernetes.authenticate.driver.serviceAccountName", "spark")
    .setDeployMode("cluster");

SparkAppHandle handle = sparkLaunch.startApplication();
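I also tried attaching a listener when starting the application, roughly like this (a minimal sketch using org.apache.spark.launcher.SparkAppHandle; the println logging is just for illustration):

SparkAppHandle handle = sparkLaunch.startApplication(new SparkAppHandle.Listener() {
    @Override
    public void stateChanged(SparkAppHandle h) {
        // Fires on every state transition reported back to the launcher.
        System.out.println("state: " + h.getState() + ", appId: " + h.getAppId());
    }

    @Override
    public void infoChanged(SparkAppHandle h) {
        // Fires when other handle info (e.g. the app id) changes.
        System.out.println("info changed, state: " + h.getState());
    }
});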

Observations:

- Even with a listener attached (see the sketch above), handle.getState() always returns UNKNOWN, and once the Spark application completes the state changes to LOST.
- The SparkAppHandle itself is not null.
- handle.getAppId() always returns null.

My best guess is that the communication between the launcher and the Spark driver running in Kubernetes is not working properly.
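Since spark.kubernetes.driver.pod.name pins the driver pod name to "spark-pi-driver", one workaround I am considering is polling the driver pod's phase directly instead of relying on the handle. A minimal sketch, assuming kubectl is installed and configured against the same cluster (the class name and polling interval are my own choices):

import java.io.BufferedReader;
import java.io.InputStreamReader;

// Workaround sketch (an assumption, not a confirmed fix): track the driver
// pod phase via kubectl instead of trusting handle.getState().
public final class DriverPodWatcher {
    public static void main(String[] args) throws Exception {
        String phase = "";
        // Pod phases progress Pending -> Running -> Succeeded/Failed.
        while (!"Succeeded".equals(phase) && !"Failed".equals(phase)) {
            Process p = new ProcessBuilder(
                    "kubectl", "get", "pod", "spark-pi-driver",
                    "-o", "jsonpath={.status.phase}")
                .redirectErrorStream(true)
                .start();
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line = r.readLine();
                phase = (line == null) ? "" : line.trim();
            }
            p.waitFor();
            System.out.println("driver pod phase: " + phase);
            Thread.sleep(5000); // poll every 5 seconds
        }
    }
}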

0 Answers:

There are no answers yet.