On Airflow 1.10, when I assign conf to TriggerDagRunOperator:
dag = DAG(
    dag_id='all_dist_cp',
    default_args=args,
    dagrun_timeout=timedelta(minutes=60),
)

distcp_1 = TriggerDagRunOperator(
    task_id="distcp_1",
    trigger_dag_id="dist_cp",
    conf={
        "SERVICE_ID": "A",
        "SOURCE": "...",
        "DESTINATION": "..."
    },
    dag=dag
)

distcp_2 = TriggerDagRunOperator(
    task_id="distcp_2",
    trigger_dag_id="dist_cp",
    conf={
        "SERVICE_ID": "B",
        "SOURCE": "...",
        "DESTINATION": "..."
    },
    dag=dag
)
it complains that I can't pass conf to TriggerDagRunOperator. So why does TriggerDagRunOperator have a conf parameter?
dagrun_operator.py:65: PendingDeprecationWarning: Invalid arguments were passed to TriggerDagRunOperator (task_id: distcp_1). Support for passing such arguments will be dropped in Airflow 2.0. Invalid arguments were:
[2019-11-04 17:07:59,083] {base_task_runner.py:115} INFO - Job 70: Subtask distcp_1 *args: ()
[2019-11-04 17:07:59,084] {base_task_runner.py:115} INFO - Job 70: Subtask distcp_1 **kwargs: {'conf': {'SERVICE_ID': 'A', 'SOURCE': '', 'DESTINATION': ''}}
[2019-11-04 17:07:59,084] {base_task_runner.py:115} INFO - Job 70: Subtask distcp_1 super(TriggerDagRunOperator, self).__init__(*args, **kwargs)
How can I pass parameters to the distcp dag? I have some arguments that need to be handed over to the distcp DAG.
The distcp dag looks like this:
args = {
    'owner': 'Airflow',
    'start_date': airflow.utils.dates.days_ago(2),
}

dag = DAG(
    dag_id='distcp',
    default_args=args,
    schedule_interval='0 0 * * *',
    dagrun_timeout=timedelta(minutes=60),
)
dist_cp = BashOperator(
    task_id='dist_cp',
    bash_command="""
        SERVICE_ID={{ dag_run.conf['SERVICE_ID'] }}
        hadoop distcp \
            -Dmapreduce.job.queuename=small \
            ....
    """,
    dag=dag,
)
dist_cp
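For what it's worth, in the 1.10.x series TriggerDagRunOperator does not take a conf keyword (that argument only arrived later); the 1.10 way to pass a payload is through python_callable, which receives the trigger context and a DagRunOrder object whose payload attribute becomes dag_run.conf in the triggered DAG. A minimal sketch (the callable name set_distcp_conf is my own; the operator wiring is commented out because it needs a running Airflow 1.10 environment):

```python
# Airflow 1.10.x workaround sketch: instead of conf=..., give
# TriggerDagRunOperator a python_callable. Airflow calls it with the
# trigger context and a DagRunOrder object; whatever you put into
# dag_run_obj.payload shows up as dag_run.conf in the triggered DAG.

def set_distcp_conf(context, dag_run_obj):
    """Attach per-trigger parameters and return the object to confirm the trigger."""
    dag_run_obj.payload = {
        "SERVICE_ID": "A",
        "SOURCE": "...",
        "DESTINATION": "...",
    }
    return dag_run_obj  # returning None would cancel the trigger

# Hypothetical wiring inside the all_dist_cp DAG file:
# from airflow.operators.dagrun_operator import TriggerDagRunOperator
# distcp_1 = TriggerDagRunOperator(
#     task_id="distcp_1",
#     trigger_dag_id="distcp",
#     python_callable=set_distcp_conf,
#     dag=dag,
# )
```

The bash_command in the distcp DAG can then keep reading `{{ dag_run.conf['SERVICE_ID'] }}` unchanged.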