Airflow date error: dag.normalize_schedule TypeError

Time: 2018-09-05 02:10:11

Tags: docker typeerror airflow

I ran into the following apache-airflow datetime problem:

Process DagFileProcessor238215-Process:
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/local/lib/python3.6/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 388, in helper
    pickle_dags)
  File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 1832, in process_file
    self._process_dags(dagbag, dags, ti_keys_to_schedule)
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 1422, in _process_dags
    dag_run = self.create_dag_run(dag)
  File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 856, in create_dag_run
    next_run_date = dag.normalize_schedule(min(task_start_dates))
TypeError: '<' not supported between instances of 'str' and 'datetime.datetime'
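The last frame shows what goes wrong: dag.normalize_schedule(min(task_start_dates)) takes the minimum over every task's start_date, and Python 3 refuses to order a str against a datetime. A minimal reproduction (the mixed list is a hypothetical stand-in for what the scheduler collected):

```python
from datetime import datetime

# min() over a mixed list reproduces the scheduler's crash:
# Python 3 cannot compare str and datetime.datetime.
task_start_dates = [datetime(2014, 1, 1), "2018-09-05"]  # hypothetical mix
try:
    min(task_start_dates)
except TypeError as exc:
    print(exc)  # '<' not supported between instances of 'str' and 'datetime.datetime'
```

So somewhere a start_date is a string rather than a datetime, even though default_args holds a proper datetime.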

I am running apache-airflow in Docker using zhongjiajie/docker-airflow, which is based on puckel/docker-airflow.

My DAG is defined as follows:

from airflow import DAG
from airflow.models import Variable
from airflow.operators.dummy_operator import DummyOperator
from udf.udf_hive_operator import HiveOperator
from airflow.operators.hive_to_mysql import HiveToMySqlTransfer
from udf.udf_hive_to_oracle import HiveToOracleTransfer
from udf.utils.date_utils import gen_history_date_para, today_belong_business_day
from datetime import datetime, timedelta

TMPL_SQL_PATH = Variable.get("sql_path")
HIVE_DB = "default"
NOSTRICT_HIVE_PARTITION_MODE = "set hive.exec.dynamic.partition.mode=nonstrict;\n"

default_args = {
    "owner": "xx_monitor",
    "description": "workflow for xx monitor system",
    "depends_on_past": False,
    "start_date": datetime(2014, 1, 1),
    "email": ["airflow@airflow.com"],
    "email_on_failure": False,
    "email_on_retry": False,
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    # "queue": "bash_queue",
    # "pool": "backfill",
    # "priority_weight": 10,
    # "end_date": datetime(2016, 1, 1),
}

dag = DAG(
    dag_id="drug_monitor",
    default_args=default_args,
    schedule_interval="0 18 * * *",
    template_searchpath=TMPL_SQL_PATH
)

The udf module contains my user-defined functions.

But something strange happens:

  • When I toggle the DAG ON in the webserver UI, it still fails, and I see the error message above in the scheduler
  • When I run a backfill from the CLI with airflow backfill -s 20140101 -e 20180101 <DAG_ID> and then go back to the scheduler, the error message disappears and all tasks start getting scheduled or queued

I have tried several ways to fix this problem, all of which failed:

  • Setting start_date in default_args to an airflow.utils.dates.days_ago object, e.g. days_ago(2018, 9, 5), but it failed
  • Setting start_date in default_args to an airflow.utils.timezone.datetime object, e.g. datetime(2018, 9, 5), but it failed
  • Setting schedule_interval in DAG to a schedule preset, e.g. @daily, but it failed
  • Setting schedule_interval in DAG to a datetime.timedelta object, but it failed

Has anyone run into a problem like this? How can I fix it?

1 Answer:

Answer 0: (score: 0)

In my DAG file I had defined a task with a custom parameter named start_date (a string). I fixed the problem by renaming that parameter.
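A minimal sketch of the collision the answer describes (ToyOperator and the business_date argument are hypothetical stand-ins, not real Airflow classes): a string passed under the name start_date overrides the datetime from default_args, and that string is what min(task_start_dates) later chokes on. Renaming the custom argument keeps the datetime intact.

```python
from datetime import datetime

DEFAULT_ARGS_START = datetime(2014, 1, 1)  # the datetime from default_args

class ToyOperator:
    """Hypothetical stand-in for an Airflow operator: start_date falls
    back to the default_args value when not passed explicitly."""
    def __init__(self, task_id, start_date=None, **kwargs):
        self.task_id = task_id
        self.start_date = start_date if start_date is not None else DEFAULT_ARGS_START

# Buggy: a string business date passed under the reserved name start_date
buggy = ToyOperator("load", start_date="2018-09-05")
# Fixed: the same value under a renamed argument (business_date is made up)
fixed = ToyOperator("load", business_date="2018-09-05")

print(type(buggy.start_date).__name__)  # str      -> scheduler crashes on min()
print(type(fixed.start_date).__name__)  # datetime -> min() works again
```

In real Airflow, any keyword an operator does not consume is applied as a task attribute, so a stray start_date string ends up in the scheduler's min(task_start_dates) comparison.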