How to use the --conf option in Airflow

Date: 2017-08-29 18:12:44

Tags: airflow apache-airflow airflow-scheduler

I am trying to run an Airflow DAG and need to pass some parameters to the tasks.

How can I read, inside the Python DAG file, the JSON string passed as the --conf parameter of the command-line trigger_dag command?

e.g.: airflow trigger_dag 'dag_name' -r 'run_id' --conf '{"key":"value"}'

2 answers:

Answer 0: (score: 10)

Two ways. From a templated field or file:

{{ dag_run.conf['key'] }}

or, when the context is available, e.g. in the Python callable of a PythonOperator:
context['dag_run'].conf['key']
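
Putting the two one-liners above together, here is a minimal sketch (assuming Airflow 1.x-style imports; the dag_id, task name, and conf key are illustrative, not from the original answer):

from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.utils.dates import days_ago

dag = DAG(dag_id="conf_demo", start_date=days_ago(1), schedule_interval=None)

def print_conf(**context):
    # dag_run is absent when the task runs outside a triggered DagRun
    # (e.g. `airflow test`), so guard the lookup
    dag_run = context.get("dag_run")
    conf = dag_run.conf if dag_run else {}
    print("value for 'key':", conf.get("key"))

print_task = PythonOperator(
    task_id="print_conf",
    python_callable=print_conf,
    provide_context=True,  # required on Airflow 1.x to receive the context
    dag=dag,
)

Triggering it with airflow trigger_dag 'conf_demo' --conf '{"key":"value"}' would then print: value for 'key': value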

Answer 1: (score: 1)

In the example provided here, https://github.com/apache/airflow/blob/master/airflow/example_dags/example_trigger_target_dag.py#L62, when trying to parse the 'conf' passed in an Airflow REST API call, use provide_context=True in the PythonOperator.

Also, the key-value pairs passed in JSON format in the REST API call can be accessed in the BashOperator and SparkSubmitOperator via '{{ dag_run.conf["key"] if dag_run else "" }}':

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator
from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator
from airflow.utils.dates import days_ago

dag = DAG(
    dag_id="example_dag",
    default_args={"start_date": days_ago(2), "owner": "airflow"},
    schedule_interval=None
)

def run_this_func(**context):
    """
    Print the payload value passed under "key" in the DagRun conf attribute.
    :param context: The execution context
    :type context: dict
    """
    print("context", context)
    print("Remotely received value of {} for key=key".format(context["dag_run"].conf["key"]))

# PythonOperator usage (provide_context=True is required on Airflow 1.x
# so that the context, including dag_run, reaches the callable)
run_this = PythonOperator(task_id="run_this", python_callable=run_this_func, dag=dag, provide_context=True)

# BashOperator usage
bash_task = BashOperator(
    task_id="bash_task",
    bash_command='echo "Here is the message: \'{{ dag_run.conf["key"] if dag_run else "" }}\'"',
    dag=dag
)

# SparkSubmitOperator usage
spark_conn_id = "spark_default"  # ID of a Spark connection defined in Airflow
spark_task = SparkSubmitOperator(
    task_id="task_id",
    conn_id=spark_conn_id,
    name="task_name",
    application="example.py",
    application_args=[
        '--key', '\'{{ dag_run.conf["key"] if dag_run else "" }}\''
    ],
    num_executors=10,
    executor_cores=5,
    executor_memory='30G',
    # driver_memory='2G',
    conf={'spark.yarn.maxAppAttempts': 1},
    dag=dag)
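
For reference, a sketch of the REST API call this answer mentions, using the Airflow 1.x experimental endpoint (the base URL is an assumption for illustration; adjust it to your webserver):

import requests

# Trigger example_dag with a conf payload via the 1.x experimental API;
# the tasks above can then read {"key": "value"} from dag_run.conf
resp = requests.post(
    "http://localhost:8080/api/experimental/dags/example_dag/dag_runs",
    json={"conf": {"key": "value"}},
)
print(resp.status_code, resp.json())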