Are failed tasks resubmitted in Apache Spark?

Time: 2014-10-08 14:53:41

Tags: apache-spark

Are failed tasks in Apache Spark automatically resubmitted to the same executor or to a different one?

2 answers:

Answer 0 (score: 10)

I believe failed tasks are resubmitted, because I have seen the same failed task submitted multiple times in the Web UI. However, if the same task fails too many times, the whole job fails (in the exception below, "task 120.3" is attempt 3, i.e. the fourth attempt, of task 120 in stage 91):

org.apache.spark.SparkException: Job aborted due to stage failure: Task 120 in stage 91.0 failed 4 times, most recent failure: Lost task 120.3 in stage 91.0

Answer 1 (score: 6)

Yes, but the maximum number of failures is configurable:

spark.task.maxFailures (default: 4): Number of individual task failures before giving up on the job. Should be greater than or equal to 1. Number of allowed retries = this value - 1.
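As a sketch of how this property is typically set (the value 8 here is an arbitrary example, not a recommendation), it can go into `spark-defaults.conf` or be passed per job on the `spark-submit` command line:

```properties
# spark-defaults.conf: allow each task up to 8 attempts (7 retries)
# before the whole job is aborted
spark.task.maxFailures    8
```

The same setting can be passed for a single job, e.g. `spark-submit --conf spark.task.maxFailures=8 ...`. Note that this counts failures of one individual task; it is separate from stage-level retries.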