I want to run task 2 if task 1 succeeds, and run task 3 if task 1 fails, allocating another flow when needed.
Basically, I want to run conditional tasks in Airflow without an SSH operator.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.models import Variable
from airflow.operators.python_operator import PythonOperator, BranchPythonOperator
from airflow.operators.bash_operator import BashOperator


def t2_error_task(context):
    instance = context['task_instance']
    if instance.task_id == "performExtract":
        print("Please implement something over this")

        task_3 = PythonOperator(
            task_id='performJoin1',
            python_callable=performJoin1,  # maybe main?
            dag=dag
        )
        dag.add_task(task_3)
with DAG(
    'manageWorkFlow',
    catchup=False,
    default_args={
        'owner': 'Mannu',
        'start_date': datetime(2018, 4, 13),
        'schedule_interval': None,
        'depends_on_past': False,
    },
) as dag:

    task_1 = PythonOperator(
        task_id='performExtract',
        python_callable=performExtract,
        on_failure_callback=t2_error_task,
        depends_on_past=True
    )

    task_2 = PythonOperator(
        task_id='printSchemas',
        depends_on_past=True,
        python_callable=printSchemaAll,  # maybe main?
    )

    task_2.set_upstream(task_1)
Answer (score: 4):
Dynamically adding tasks based on execution-time status is not something Airflow supports. To get the behaviour you want, add task_3 to your DAG up front, but change its trigger_rule to all_failed. With that rule, the task is marked as skipped when task_1 succeeds, but it is executed when task_1 fails.
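Below is a minimal sketch of that approach, assuming the question's performExtract, printSchemaAll and performJoin1 callables exist (stubbed out here as placeholders), and with schedule_interval moved to a DAG argument rather than a default_arg:

from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def performExtract(**kwargs):
    pass  # placeholder for the real extract logic


def printSchemaAll(**kwargs):
    pass  # placeholder: should run only when performExtract succeeds


def performJoin1(**kwargs):
    pass  # placeholder: should run only when performExtract fails


with DAG(
    'manageWorkFlow',
    catchup=False,
    schedule_interval=None,
    default_args={
        'owner': 'Mannu',
        'start_date': datetime(2018, 4, 13),
    },
) as dag:

    task_1 = PythonOperator(
        task_id='performExtract',
        python_callable=performExtract,
    )

    task_2 = PythonOperator(
        task_id='printSchemas',
        python_callable=printSchemaAll,
    )

    # Declared up front instead of inside a failure callback; the trigger
    # rule makes it run only when its upstream task has failed.
    task_3 = PythonOperator(
        task_id='performJoin1',
        python_callable=performJoin1,
        trigger_rule='all_failed',
    )

    task_2.set_upstream(task_1)
    task_3.set_upstream(task_1)

With these dependencies, a successful performExtract run triggers printSchemas and skips performJoin1, while a failed run marks printSchemas as upstream_failed and executes performJoin1.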