Change the execution concurrency of an Airflow DAG

Time: 2017-01-15 13:24:52

Tags: python concurrency airflow

I want to change the dag_concurrency setting for one specific Airflow DAG. There seems to be a global dag_concurrency parameter in airflow.cfg, but can it be given a different value for different DAGs?
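For reference, the global default sits in airflow.cfg under the [core] section, roughly as in this illustrative excerpt (the exact value depends on your installation):

# airflow.cfg (excerpt)
[core]
# maximum number of task instances allowed to run concurrently within a single DAG
dag_concurrency = 16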

I tried adding a concurrency parameter to the SSHExecuteOperator task in my DAG code, but the concurrency value shown in the DAG details is still the default (16).

from airflow import DAG
from datetime import datetime, timedelta
from airflow.contrib.hooks.ssh_hook import SSHHook
from airflow.contrib.operators.ssh_execute_operator import SSHExecuteOperator

default_args = {
  'owner': 'airflow',
  'depends_on_past': False,
  'start_date': datetime.now(),
  'email': ['exceptions@airflow.com'],
  'email_on_failure': True,
  'retries': 0
}

#server must be changed to point to the correct environment
sshHookEtl = SSHHook(conn_id='SSH__airflow@myserver')

with DAG(
  'ed_data_quality_20min-v1.6.6',
  default_args=default_args,
  schedule_interval="0,20,40 * * * *",
  dagrun_timeout=timedelta(hours=24)) as dag:
  (
    dag
    >> SSHExecuteOperator(
          task_id='run_remote_ed_data_quality_20min',
          bash_command='bash /opt/scripts/shell/EXEC_ED_DATA_QUALITY_20MIN.sh ',
          ssh_hook=sshHookEtl,
          retries=0,
          concurrency=1,  # the concurrency value I tried to set on the task
          dag=dag)
  )

Here are the DAG details:

2 Answers:

Answer 0 (score: 3):

I found the solution. I was not adding the concurrency parameter in the right place: it should be set directly as an attribute of the DAG object, not on the SSHExecuteOperator task. Here is the new code:

from airflow import DAG
from datetime import datetime, timedelta
from airflow.contrib.hooks.ssh_hook import SSHHook
from airflow.contrib.operators.ssh_execute_operator import SSHExecuteOperator

default_args = {
  'owner': 'airflow',
  'depends_on_past': False,
  'start_date': datetime.now(),
  'email': ['exceptions@airflow.com'],
  'email_on_failure': True,
  'retries': 0
}

#server must be changed to point to the correct environment
sshHookEtl = SSHHook(conn_id='SSH__airflow@myserver')

with DAG(
  'ed_data_quality_20min-v1.6.6',
  default_args=default_args,
  schedule_interval="0,20,40 * * * *",
  dagrun_timeout=timedelta(hours=24),
  concurrency=1) as dag:  # concurrency now set on the DAG object itself
  (
    dag
    >> SSHExecuteOperator(
          task_id='run_remote_ed_data_quality_20min',
          bash_command='bash /opt/scripts/shell/EXEC_ED_DATA_QUALITY_20MIN.sh ',
          ssh_hook=sshHookEtl,
          retries=0,
          dag=dag)
  )

Answer 1 (score: 1):

Well... you can just set concurrency on the DAG object. There is also a task_concurrency parameter on the BaseOperator object. concurrency is not a BaseOperator param, which is why it is not a field on the SSHExecuteOperator task and setting it there has no effect.
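A minimal sketch of that task-level option, assuming an Airflow 1.x release where BaseOperator accepts task_concurrency (the DAG id, schedule and BashOperator command below are made up for illustration):

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime

with DAG(
  'example_task_concurrency',
  start_date=datetime(2017, 1, 1),
  schedule_interval='@daily',
  concurrency=4) as dag:  # DAG-level cap: at most 4 task instances of this DAG run at once

  limited = BashOperator(
    task_id='limited_task',
    bash_command='echo "at most one running instance of this task"',
    task_concurrency=1,  # task-level cap, accepted by BaseOperator
    dag=dag)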