What is the best way to re-run a task (A) 3 times in sequence?
i.e. Task A -> Task A -> Task A -> Task B
I ask because I will then run a separate data-validation task (B) that compares the data from those 3 individual runs.
Here is what I have so far:
from datetime import datetime
from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG("hello_world_0", description="Starting tutorial", schedule_interval='* * * * *',
          start_date=datetime(2019, 1, 1),
          catchup=False)

# Three identical pull attempts, followed by the validation step.
data_pull_1 = BashOperator(task_id='attempt_1', bash_command='echo "Hello World - 1!"', dag=dag)
data_pull_2 = BashOperator(task_id='attempt_2', bash_command='echo "Hello World - 2!"', dag=dag)
data_pull_3 = BashOperator(task_id='attempt_3', bash_command='echo "Hello World - 3!"', dag=dag)
data_validation = BashOperator(task_id='data_validation', bash_command='echo "Data Validation!"', dag=dag)

data_pull_1 >> data_pull_2 >> data_pull_3 >> data_validation
This works, but is there a more elegant way to do it?
Answer (score: 1)
You can try the following implementation, which uses a for loop to create the 3 pull tasks:
from datetime import datetime
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
dag = DAG(
    "hello_world_0",
    description="Starting tutorial",
    schedule_interval=None,
    start_date=datetime(2019, 1, 1),
    catchup=False
)
chain_operators = []
max_attempt = 3
for attempt in range(max_attempt):
    data_pull = BashOperator(
        task_id='attempt_{}'.format(attempt),
        bash_command='echo "Hello World - {}!"'.format(attempt),
        dag=dag
    )
    chain_operators.append(data_pull)
data_validation = BashOperator(task_id='data_validation', bash_command='echo "Data Validation!"', dag=dag)
chain_operators.append(data_validation)
# Wire each task to the next one so they run strictly in sequence.
for i, val in enumerate(chain_operators[:-1]):
    val.set_downstream(chain_operators[i + 1])
I changed schedule_interval to None because with '* * * * *' the job would be triggered continuously (once every minute).
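For an even more compact version, newer Airflow releases ship a chain() helper that wires a list of tasks up in order, so the explicit set_downstream loop is not needed. Below is a minimal sketch assuming Airflow 2.x import paths (airflow.models.baseoperator.chain and airflow.operators.bash.BashOperator); on 1.10.x a similar helper lives in airflow.utils.helpers and BashOperator is imported from airflow.operators.bash_operator.

from datetime import datetime

from airflow import DAG
from airflow.models.baseoperator import chain
from airflow.operators.bash import BashOperator

with DAG(
    "hello_world_0",
    description="Starting tutorial",
    schedule_interval=None,
    start_date=datetime(2019, 1, 1),
    catchup=False,
) as dag:
    # Build the three identical pull attempts, then the validation task.
    pulls = [
        BashOperator(
            task_id='attempt_{}'.format(attempt),
            bash_command='echo "Hello World - {}!"'.format(attempt),
        )
        for attempt in range(1, 4)
    ]
    data_validation = BashOperator(
        task_id='data_validation',
        bash_command='echo "Data Validation!"',
    )
    # chain(*pulls, data_validation) is equivalent to
    # pulls[0] >> pulls[1] >> pulls[2] >> data_validation.
    chain(*pulls, data_validation)

Because the tasks are created inside the with DAG(...) block, they are attached to the DAG automatically and no dag=dag argument is needed; the validation task still only runs after the third pull attempt has finished.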