Apache Airflow not running a command, or not sending the command output correctly?

Asked: 2019-10-07 16:09:08

Tags: airflow

I'm running Apache Airflow, and whenever an aws command runs, no output is shown. The command does run in a bash shell on the worker, but the task then just hangs and waits. Do I need to do something to send data back to Airflow to signal that the command has completed?

Logs

[2019-10-07 15:48:13,098] {bash_operator.py:91} INFO - Exporting the following env vars:
    AIRFLOW_CTX_DAG_ID=lambda_async
    AIRFLOW_CTX_TASK_ID=aws_lambda
    AIRFLOW_CTX_EXECUTION_DATE=2019-10-07T15:48:01.310140+00:00
    AIRFLOW_CTX_DAG_RUN_ID=manual__2019-10-07T15:48:01.310140+00:00

[2019-10-07 15:48:13,099] {bash_operator.py:105} INFO - Temporary script location: /tmp/airflowtmpj4r2x2xg/aws_lambda2qu0f2vl

[2019-10-07 15:48:13,099] {bash_operator.py:115} INFO - Running command: aws lambda --region us-east-1 list-functions

[2019-10-07 15:48:13,103] {bash_operator.py:124} INFO - Output:

Code

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
import datetime

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime.datetime(2019, 7, 30),
    'email': ['xxxxxxxxx'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': datetime.timedelta(minutes=5),
}


dag = DAG('lambda_async', default_args=default_args, schedule_interval=datetime.timedelta(days=1))

t1_command = "aws lambda --region us-east-1 list-functions"

t1 = BashOperator(
    task_id='aws_lambda',
    bash_command=t1_command,
    dag=dag)

t1
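One way to rule Airflow out is to run the same kind of shell command directly with Python's `subprocess`, the same way `BashOperator` does (via `bash -c`), capturing stdout/stderr with a timeout so a hang surfaces as an exception instead of a silent wait. This is a debugging sketch, not part of the DAG; `echo` is a stand-in for the real `aws lambda list-functions` call so the snippet runs without AWS credentials:

```python
import subprocess

# Debugging sketch: execute a command via `bash -c`, as BashOperator does,
# and capture its output. In a real test you would substitute the actual
# command, e.g. "aws lambda list-functions --region us-east-1".
result = subprocess.run(
    ["bash", "-c", 'echo \'{"Functions": []}\''],
    capture_output=True,
    text=True,
    timeout=30,  # fail fast instead of hanging like the Airflow task
)

print("exit code:", result.returncode)
print("stdout:", result.stdout.strip())
print("stderr:", result.stderr.strip())
```

If the command hangs or produces no stdout here as well, the problem is the command or the worker's environment (credentials, network), not Airflow itself.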

0 Answers:

No answers yet.