Spark-Scala code completes in spark-shell but runs indefinitely via spark-submit

Date: 2019-09-10 20:25:42

Tags: scala apache-spark

I have Spark-Scala code that runs to completion as expected in spark-shell, but when it is launched via spark-submit (as shown below), the executors keep failing with exit code 1 and the job runs indefinitely.

./spark-submit --name Loader --class SP.Loader --master spark://spark-master.default.svc.cluster.local:7077 --executor-memory 64G --packages com.datastax.spark:spark-cassandra-connector_2.11:2.4.1 --conf spark.cores.max=280 --conf spark.cassandra.connection.host=cassandra.default.svc.cluster.local --driver-memory 32G --conf spark.driver.maxResultSize=16G /notebooks/Drago/scala/target/scala-2.11/loader_2.11-1.0.jar
19/09/10 20:16:15 INFO BlockManagerMaster: Removal of executor 11 requested
19/09/10 20:16:15 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asked to remove non-existent executor 11
19/09/10 20:16:15 INFO BlockManagerMasterEndpoint: Trying to remove executor 11 from BlockManagerMaster.
19/09/10 20:16:15 INFO StandaloneSchedulerBackend: Granted executor ID app-20190910201611-0039/20 on hostPort 10.42.4.135:39575 with 35 core(s), 64.0 GB RAM
19/09/10 20:16:15 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20190910201611-0039/20 is now RUNNING
19/09/10 20:16:15 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20190910201611-0039/15 is now EXITED (Command exited with code 1)
19/09/10 20:16:15 INFO StandaloneSchedulerBackend: Executor app-20190910201611-0039/15 removed: Command exited with code 1
19/09/10 20:16:15 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20190910201611-0039/21 on worker-20190830191329-10.42.6.133-34823 (10.42.6.133:34823) with 35 core(s)
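The driver log above only shows that executors exit with code 1; the actual failure reason is written to each executor's own stderr on the worker node. As a first diagnostic step on a standalone cluster, one could inspect those logs under the worker's work directory (a sketch; the `SPARK_HOME` path is an assumption and must match your worker installation):

```shell
#!/bin/sh
# Run this on a worker node, not on the driver.
# Spark standalone workers keep per-executor logs under $SPARK_HOME/work/<app-id>/<executor-id>/.
SPARK_HOME="${SPARK_HOME:-/opt/spark}"          # assumption: adjust to your install
APP_ID="app-20190910201611-0039"                # app id taken from the driver log above

# List the executor directories this worker created for the application.
ls "$SPARK_HOME/work/$APP_ID" 2>/dev/null || echo "no work dir here; run on a worker node"

# The real error (OutOfMemoryError, ClassNotFoundException, etc.) is usually in stderr,
# e.g.: cat "$SPARK_HOME/work/$APP_ID/15/stderr"   (executor 15 exited with code 1 above)
```

Common causes for "works in spark-shell, exits with code 1 under spark-submit" include an `--executor-memory` request larger than what the worker can grant, and classpath differences (the shell pre-creates the SparkSession and has `--packages` jars already resolved, while a submitted jar must set up both itself).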

0 Answers:

No answers yet