Catching AirflowSkipException for an S3KeySensor call in Airflow

Asked: 2019-03-27 05:45:39

Tags: python-2.7 airflow

With soft_fail set to True, an S3KeySensor that times out marks itself and all downstream tasks as skipped. However, I need to catch the exception involved, AirflowSkipException, and perform some cleanup based on it. My problem is that the try/except block below never catches anything, so I cannot run the downstream handling that depends on it.

I used the Airflow source documentation below to find out the exception type: https://airflow.readthedocs.io/en/stable/_modules/airflow/sensors/base_sensor_operator.html
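
For reference, that source shows the exception being raised inside the sensor's execute() method, i.e. while the task instance is running, not when the operator object is constructed in the DAG file. Roughly paraphrased from the linked module (simplified; exact wording varies across Airflow 1.10.x releases):

# Paraphrased and simplified from airflow/sensors/base_sensor_operator.py:
# the skip is raised during task execution, not during DAG parsing.
def execute(self, context):
    started_at = timezone.utcnow()
    while not self.poke(context):
        if (timezone.utcnow() - started_at).total_seconds() > self.timeout:
            if self.soft_fail:
                # soft_fail=True turns the timeout into a skip
                raise AirflowSkipException('Snap. Time is OUT.')
            raise AirflowSensorTimeout('Snap. Time is OUT.')
        sleep(self.poke_interval)
    self.log.info("Success criteria met. Exiting.")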

My attempt at catching the exception looks like this:

from datetime import datetime

from airflow.exceptions import AirflowSkipException
from airflow.models import Variable
from airflow.sensors.s3_key_sensor import S3KeySensor

try:
    s3_file_watcher_task_operator = S3KeySensor(
        task_id='s3_file_watcher_task',
        bucket_key="{{ task_instance.xcom_pull(key='s3_poll_config', task_ids='dag_start_config_setup_task')['source_extract_regex'] }}",
        poke_interval=float(Variable.get("S3_POKE_INTERVAL")),  # checking interval in seconds
        timeout=float(Variable.get("S3_TIMEOUT")),  # total seconds before timing out
        soft_fail=True,  # mark the task 'skipped' instead of 'failed'; a downstream task should handle soft failures
        wildcard_match=True,
        bucket_name="{{ task_instance.xcom_pull(key='s3_poll_config', task_ids='dag_start_config_setup_task')['landing_bucket'] }}",
        aws_conn_id="{{ task_instance.xcom_pull(key='s3_poll_config', task_ids='dag_start_config_setup_task')['aws_conn_id'] }}",
        dag=dag)
except AirflowSkipException as e:
    closetaskinstance(job_exec_config, task_execution_id[0], 'FAILED', str(datetime.now()))
    closedaginstance(job_exec_config, dag_execution_id, 'FAILED', str(datetime.now()))
    raise
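
Because the try/except above only wraps the operator's constructor, it can never fire: at DAG-definition time nothing is poked and no AirflowSkipException is raised. One way to intercept the skip at run time is to drive the sensor from inside a PythonOperator callable. The sketch below is an illustration, not a verified fix: it assumes Airflow 1.10-style imports, and it reuses the closetaskinstance/closedaginstance helpers and the job_exec_config, task_execution_id, and dag_execution_id variables from the question, which are not defined here.

from datetime import datetime

from airflow.exceptions import AirflowSkipException
from airflow.models import Variable
from airflow.operators.python_operator import PythonOperator
from airflow.sensors.s3_key_sensor import S3KeySensor


def watch_s3_and_handle_skip(**context):
    # Pull the same config dict the templated fields above would have used.
    cfg = context['task_instance'].xcom_pull(
        key='s3_poll_config', task_ids='dag_start_config_setup_task')
    sensor = S3KeySensor(
        task_id='s3_file_watcher_inner',  # not registered in the DAG; run manually below
        bucket_key=cfg['source_extract_regex'],
        poke_interval=float(Variable.get("S3_POKE_INTERVAL")),
        timeout=float(Variable.get("S3_TIMEOUT")),
        soft_fail=True,
        wildcard_match=True,
        bucket_name=cfg['landing_bucket'],
        aws_conn_id=cfg['aws_conn_id'])
    try:
        sensor.execute(context)  # AirflowSkipException is raised HERE, at run time
    except AirflowSkipException:
        # bookkeeping helpers/variables taken from the question
        closetaskinstance(job_exec_config, task_execution_id[0], 'FAILED', str(datetime.now()))
        closedaginstance(job_exec_config, dag_execution_id, 'FAILED', str(datetime.now()))
        raise  # re-raise so the wrapper task itself is still marked as skipped


s3_file_watcher_task = PythonOperator(
    task_id='s3_file_watcher_task',
    python_callable=watch_s3_and_handle_skip,
    provide_context=True,  # Airflow 1.x: pass the context into the callable
    dag=dag)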

0 Answers:

There are no answers yet.