Airflow - trying to loop over operators; execution does not wait for the actual operation to complete

Asked: 2018-12-19 22:48:03

Tags: airflow, google-cloud-composer

In Airflow, I am trying to loop over an operator (BigQueryOperator). The DAG completes even before the queries finish.

My DAG essentially does the following:

  1. Read a set of insert queries, one by one.
  2. Trigger each query using a BigQueryOperator.

When I try to write 2 records (with 2 insert statements), I can see only 1 record.

# Read the GCS bucket file and get the list of SQL queries (as text), one per line
bteqQueries = ReadFile()

for currQuery in bteqQueries.split('\n'):
    #logging.info("currQuery : {}".format(currQuery))
    parameter = {
        'cur_query': currQuery
    }
    logging.info("START $$ : {}".format(parameter.get('cur_query')))
    gcs2BQ = BigQueryOperator(
        task_id='gcs2bq_insert',
        bql=parameter.get('cur_query'),
        write_disposition="WRITE_APPEND",
        bigquery_conn_id='bigquery_default',
        use_legacy_sql='False',
        dag=dag,
        task_concurrency=1)
    logging.info("END $$ : {}".format(parameter.get('cur_query')))

gcs2BQ
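
ReadFile() is not shown above. A minimal sketch of such a helper, assuming the queries live in a single GCS object and using the contrib GoogleCloudStorageHook that ships with Airflow 1.10 (the bucket name, object path, and connection id below are placeholders, not values from the question):

from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook

def ReadFile(bucket='my-bucket', object_name='sql/insert_queries.sql'):
    # Download the object and return its contents as text;
    # each line is expected to hold one SQL statement.
    hook = GoogleCloudStorageHook(google_cloud_storage_conn_id='google_cloud_default')
    data = hook.download(bucket, object_name)
    return data.decode('utf-8')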

I expected all the queries in the input file (in the GCS bucket) to be executed. I have a couple of insert queries and expected 2 records in the final BigQuery table, but I only see 1 record.

******** Logs below ******

 2018-12-19 03:57:16,194] {base_task_runner.py:98} INFO - Subtask: [2018-12-19 03:57:16,190] {gcs2BQ_bteq.py:59} INFO - START $$ : insert into `gproject.bucket.employee_test_stg.employee_test_stg` (emp_id,emp_name,edh_end_dttm) values (2,"srikanth","2099-01-01") ; 
[2018-12-19 03:57:16,205] {base_task_runner.py:98} INFO - Subtask: [2018-12-19 03:57:16,201] {models.py:2190} WARNING - schedule_interval is used for <Task(BigQueryOperator): gcs2bq_insert>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2018-12-19 03:57:16,210] {base_task_runner.py:98} INFO - Subtask: [2018-12-19 03:57:16,209] {gcs2BQ_bteq.py:68} INFO - END $$ : insert into `project.bucket.employee_test_stgemployee_test_stg` (emp_id,emp_name,edh_end_dttm) values (2,"srikanth","2099-01-01") ; 
[2018-12-19 03:57:16,213] {base_task_runner.py:98} INFO - Subtask: [2018-12-19 03:57:16,213] {gcs2BQ_bteq.py:59} INFO - START $$ : insert into `project.bucket.employee_test_stg` (emp_id,emp_name,edh_end_dttm) values (3,"srikanth","2099-01-01") ;
[2018-12-19 03:57:16,223] {base_task_runner.py:98} INFO - Subtask: [2018-12-19 03:57:16,218] {models.py:2190} WARNING - schedule_interval is used for <Task(BigQueryOperator): gcs2bq_insert>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2018-12-19 03:57:16,230] {base_task_runner.py:98} INFO - Subtask: [2018-12-19 03:57:16,230] {gcs2BQ_bteq.py:68} INFO - END $$ : insert into `dataset1.adp_etl_stg.employee_test_stg` (emp_id,emp_name,edh_end_dttm) values (3,"srikanth","2099-01-01") ;
[2018-12-19 03:57:16,658] {base_task_runner.py:98} INFO - Subtask: [2018-12-19 03:57:16,655] {bigquery_operator.py:90} INFO - Executing: insert into `dataset1.adp_etl_stg.employee_test_stg` (emp_id,emp_name,edh_end_dttm) values (2,"srikanth","2099-01-01") ; 
[2018-12-19 03:57:16,703] {base_task_runner.py:98} INFO - Subtask: [2018-12-19 03:57:16,702] {gcp_api_base_hook.py:74} INFO - Getting connection using `gcloud auth` user, since no key file is defined for hook.
[2018-12-19 03:57:16,848] {base_task_runner.py:98} INFO - Subtask: [2018-12-19 03:57:16,847] {discovery.py:267} INFO - URL being requested: GET https://www.googleapis.com/discovery/v1/apis/bigquery/v2/rest
[2018-12-19 03:57:16,849] {base_task_runner.py:98} INFO - Subtask: [2018-12-19 03:57:16,849] {client.py:595} INFO - Attempting refresh to obtain initial access_token
[2018-12-19 03:57:17,012] {base_task_runner.py:98} INFO - Subtask: [2018-12-19 03:57:17,011] {discovery.py:852} INFO - URL being requested: POST https://www.googleapis.com/bigquery/v2/projects/gcp-***Project***/jobs?alt=json
[2018-12-19 03:57:17,214] {base_task_runner.py:98} INFO - Subtask: [2018-12-19 03:57:17,214] {discovery.py:852} INFO - URL being requested: GET https://www.googleapis.com/bigquery/v2/projects/gcp-***Project***/jobs/job_jqrRn4lK8IHqTArYAVj6cXRfLgDd?alt=json
[2018-12-19 03:57:17,304] {base_task_runner.py:98} INFO - Subtask: [2018-12-19 03:57:17,303] {bigquery_hook.py:856} INFO - Waiting for job to complete : gcp-***Project***, job_jqrRn4lK8IHqTArYAVj6cXRfLgDd
[2018-12-19 03:57:22,311] {base_task_runner.py:98} INFO - Subtask: [2018-12-19 03:57:22,310] {discovery.py:852} INFO - URL being requested: GET https://www.googleapis.com/bigquery/v2/projects/gcp-***Project***/jobs/job_jqrRn4lK8IHqTArYAVj6cXRfLgDd?alt=json

1 answer:

Answer 0 (score: 0)

Try the following code:

gcs2BQ = []
for index, currQuery in enumerate(bteqQueries.split('\n')):
    logging.info("currQuery : {}".format(currQuery))
    parameter = {
        'cur_query': currQuery
    }
    logging.info("START $$ : {}".format(parameter.get('cur_query')))
    gcs2BQ.append(BigQueryOperator(
        task_id='gcs2bq_insert_{}'.format(index),  # unique task_id per query
        bql=parameter.get('cur_query'),
        write_disposition="WRITE_APPEND",
        bigquery_conn_id='bigquery_default',
        use_legacy_sql=False,  # pass a real boolean; the string 'False' is truthy
        dag=dag,
        task_concurrency=1))
    logging.info("END $$ : {}".format(parameter.get('cur_query')))

    # Chain each operator after the previous one so the inserts run sequentially.
    if index > 0:
        gcs2BQ[index - 1] >> gcs2BQ[index]

Basically, the task_id must be unique, and with the code above you can also specify explicit dependencies between the queries.
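
If you prefer not to track the index for chaining, the chain() helper in airflow.utils.helpers (Airflow 1.x) expresses the same sequential dependency. A minimal sketch, assuming the same bteqQueries string and dag object as above:

from airflow.utils.helpers import chain

# Build one uniquely named operator per query.
tasks = []
for index, currQuery in enumerate(bteqQueries.split('\n')):
    tasks.append(BigQueryOperator(
        task_id='gcs2bq_insert_{}'.format(index),
        bql=currQuery,
        write_disposition="WRITE_APPEND",
        bigquery_conn_id='bigquery_default',
        use_legacy_sql=False,
        dag=dag))

# chain(t0, t1, t2) is equivalent to t0 >> t1 >> t2.
chain(*tasks)

Either way, the key point stands: each BigQueryOperator needs its own unique task_id so that every insert becomes a separate task in the DAG.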