I am running Airflow inside a container during a CodeBuild build. It currently executes every step, but the part that triggers the DAG fails.
- sudo sh scripts/setup.sh
- pipenv --three install
- airflow initdb
- airflow scheduler > ~/scheduler.log 2>&1 &
- airflow list_dags -sd $(pwd)/dags
- airflow trigger_dag -sd $(pwd)/dags Pampa
When I run list_dags, it shows:
-------------------------------------------------------------------
DAGS
-------------------------------------------------------------------
Pampa
But it does not execute the DAG:
airflow trigger_dag -sd $(pwd)/dags Pampa
[2018-07-05 20:04:36,495] {__init__.py:45} INFO - Using executor SequentialExecutor
[2018-07-05 20:04:36,556] {models.py:189} INFO - Filling up the DagBag from /codebuild/output/src188373663/dags
Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 27, in <module>
    args.func(args)
  File "/usr/local/lib/python3.6/site-packages/airflow/bin/cli.py", line 199, in trigger_dag
    execution_date=args.exec_date)
  File "/usr/local/lib/python3.6/site-packages/airflow/api/client/local_client.py", line 27, in trigger_dag
    execution_date=execution_date)
  File "/usr/local/lib/python3.6/site-packages/airflow/api/common/experimental/trigger_dag.py", line 27, in trigger_dag
    raise AirflowException("Dag id {} not found".format(dag_id))
airflow.exceptions.AirflowException: Dag id Pampa not found
Answer 0 (score: 0)
The error was in $AIRFLOW_HOME, which pointed to the following folder:
$AIRFLOW_HOME = /home/ubuntu
But the DAGs live in /home/ubuntu/airflow/dags/, so Airflow could not find them; it only found the DAGs when I passed -sd with the subfolder path to list_dags.
I had to change the way $AIRFLOW_HOME is set.
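For reference, a minimal sketch of the adjusted build commands, assuming the DAGs stay in $(pwd)/dags as in the question and that the exported variable persists between commands (the export line and the dropped -sd flags are my adjustments, not part of the original post):

- export AIRFLOW_HOME=$(pwd)                    # dags_folder defaults to $AIRFLOW_HOME/dags
- airflow initdb
- airflow scheduler > ~/scheduler.log 2>&1 &
- airflow list_dags                             # Pampa should now appear without -sd
- airflow trigger_dag Pampa                     # trigger_dag reads the configured dags_folder, so it finds the DAG

With $AIRFLOW_HOME pointing at the folder that contains dags/, both the scheduler and trigger_dag look in the same place, which is the point of the answer above.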