MsSqlOperator does not execute the SQL statement + Apache Airflow + DAG

Date: 2019-03-07 03:19:29

Tags: airflow, airflow-scheduler

Below is the DAG I wrote. When I trigger the DAG manually (or let it run on schedule), the task is marked as successful, but the SQL statement is never actually executed against the backend database. I have copied the DAG code and the log for the task below. Can anyone help me with this?

from airflow import DAG
from airflow.operators.mssql_operator import MsSqlOperator
from datetime import datetime, timedelta
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'santosh',
    'depends_on_past': False,
    'email': ['santosh.dadisetti@gmail.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'start_date': datetime(2019,3,4)
}

dag = DAG('MSSQL_server_test1', default_args=default_args)

task1 = MsSqlOperator(
    task_id='MSSQL_Task',
    mssql_conn_id='mssql_hook',
    database='sampledb',
    sql='INSERT INTO Airflow_testing1 SELECT * FROM [dbo].[Table1]',
    dag=dag)

task2 = BashOperator(
    task_id='taskid_2',
    bash_command='echo "Hello World from Task 2"',
    dag=dag)

task1 >> task2
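For background on the symptom (the task exits with return code 0 yet the inserted rows never appear): with DB-API drivers such as pymssql, DML that is executed but never committed is rolled back when the connection closes. The sketch below illustrates that commit semantics using the stdlib sqlite3 module purely as a stand-in for pymssql; it is an analogy, not a claim about what this particular Airflow version does internally.

```python
import os
import sqlite3
import tempfile

# DB-API commit semantics sketch (sqlite3 standing in for pymssql):
# an INSERT that is never committed is rolled back when the
# connection closes, even though every call returned successfully.
db_path = os.path.join(tempfile.mkdtemp(), "demo.db")

conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE t (x INTEGER)")
conn.commit()

# Insert WITHOUT committing, then close: the row is lost.
conn.execute("INSERT INTO t VALUES (1)")
conn.close()

conn = sqlite3.connect(db_path)
rows_without_commit = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]

# Insert WITH an explicit commit: the row survives.
conn.execute("INSERT INTO t VALUES (1)")
conn.commit()
conn.close()

conn = sqlite3.connect(db_path)
rows_with_commit = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
conn.close()

print(rows_without_commit, rows_with_commit)  # prints: 0 1
```

MsSqlOperator does expose an `autocommit` parameter (defaulting to `False`); whether an uncommitted transaction is actually the culprit here would need to be checked against the Airflow and pymssql versions in use.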

Log

*** Reading local file: /root/airflow/logs/MSSQL_server_test1/MSSQL_Task/2019-03-06T21:09:07.392474+00:00/1.log
[2019-03-06 21:09:24,674] {models.py:1359} INFO - Dependencies all met for TaskInstance: MSSQL_server_test1.MSSQL_Task 2019-03-06T21:09:07.392474+00:00 [queued]
[2019-03-06 21:09:24,833] {models.py:1359} INFO - Dependencies all met for TaskInstance: MSSQL_server_test1.MSSQL_Task 2019-03-06T21:09:07.392474+00:00 [queued]
[2019-03-06 21:09:24,833] {models.py:1571} INFO - 
--------------------------------------------------------------------------------
Starting attempt 1 of 2
--------------------------------------------------------------------------------

[2019-03-06 21:09:25,414] {models.py:1593} INFO - Executing <Task(MsSqlOperator): MSSQL_Task> on 2019-03-06T21:09:07.392474+00:00
[2019-03-06 21:09:25,415] {base_task_runner.py:118} INFO - Running: ['bash', '-c', 'airflow run MSSQL_server_test1 MSSQL_Task 2019-03-06T21:09:07.392474+00:00 --job_id 47 --raw -sd DAGS_FOLDER/mssql_1.py --cfg_path /tmp/tmppofyzjrw']
[2019-03-06 21:09:26,802] {base_task_runner.py:101} INFO - Job 47: Subtask MSSQL_Task [2019-03-06 21:09:26,801] {settings.py:174} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=29200
[2019-03-06 21:09:27,615] {base_task_runner.py:101} INFO - Job 47: Subtask MSSQL_Task [2019-03-06 21:09:27,613] {__init__.py:51} INFO - Using executor SequentialExecutor
[2019-03-06 21:09:29,164] {base_task_runner.py:101} INFO - Job 47: Subtask MSSQL_Task [2019-03-06 21:09:29,162] {models.py:273} INFO - Filling up the DagBag from /root/airflow/dags/mssql.py
[2019-03-06 21:09:30,664] {base_task_runner.py:101} INFO - Job 47: Subtask MSSQL_Task [2019-03-06 21:09:30,663] {cli.py:520} INFO - Running <TaskInstance: MSSQL_server_test1.MSSQL_Task 2019-03-06T21:09:07.392474+00:00 [running]> on host XXXXXX
[2019-03-06 21:09:32,126] {mssql_operator.py:54} INFO - Executing: INSERT INTO Airflow_testing1 SELECT * FROM [dbo].[table1]
[2019-03-06 21:09:32,608] {logging_mixin.py:95} INFO - [2019-03-06 21:09:32,606] {base_hook.py:83} INFO - Using connection to: id: mssql_hook. Host: <host>, Port: 1433, Schema: None, Login: usr_airflow, Password: XXXXXXXX, extra: {}
[2019-03-06 21:09:32,985] {logging_mixin.py:95} INFO - [2019-03-06 21:09:32,985] {dbapi_hook.py:166} INFO - INSERT INTO Airflow_testing1 SELECT * FROM [dbo].[Table1]
[2019-03-06 21:09:34,725] {logging_mixin.py:95} INFO - [2019-03-06 21:09:34,722] {jobs.py:2527} INFO - Task exited with return code 0

0 Answers:

No answers yet