I am new to Airflow. I am trying to run a DAG without doing any scheduling.
I want to run the pipeline with command-line arguments and overwrite all current output. I have no start date, no schedule, no time, and no retry logic; I just want to run a set of functions in sequence to get started.
The documentation always includes a date:
airflow test tutorial print_date 2015-06-01
I want to run the DAG so that it executes all of the functions and ignores any previous runs. How do I remove all dates and date logic from the DAG?
I have a modified version of the tutorial DAG file:
"""
Code that goes along with the Airflow tutorial located at:
https://github.com/airbnb/airflow/blob/master/airflow/example_dags/tutorial.py
"""
import os
import cPickle
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from datetime import datetime
default_args = {
'owner': 'airflow',
'depends_on_past': False,
'start_date': datetime(2015, 6, 1),
'email': ['airflow@airflow.com'],
'email_on_failure': False,
'email_on_retry': False,
'schedule_interval': '@once'
}
dag = DAG('tutorial_me', default_args=default_args)
def save_file(filenm):
with open(filenm, 'wb') as pickle_file:
cPickle.dump(['1','2',3], pickle_file)
def delete_file(filenm):
print "************ THIS IS WHERE STDOUT GOES"
if os.path.exists(filenm):
os.path.remove(filenm)
# t1, t2 and t3 are examples of tasks created by instantiating operators
t1 = PythonOperator(
task_id='save_file',
python_callable=save_file,
op_kwargs=dict(filenm='__myparamfile__.txt'),
dag=dag)
t2 = PythonOperator(
task_id='remove_file',
python_callable=delete_file,
op_kwargs=dict(filenm='__myparamfile__.txt'),
dag=dag)
t1.set_upstream(t2)
The first time I ran it:
airflow run tutorial_me remove_file 2015-01-04
it worked and printed the "************ THIS IS WHERE STDOUT GOES"
line. The second time I ran it, it did not.
After the second run, the log file looks like this:
cat 2015-01-04T00\:00\:00
[2016-12-10 11:27:47,158] {models.py:154} INFO - Filling up the DagBag from /Users/user_01/airflow/dags
[2016-12-10 11:27:47,214] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): save_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:47,214] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): remove_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:47,227] {base_executor.py:36} INFO - Adding to queue: airflow run tutorial_me remove_file 2015-01-04T00:00:00 --local -sd DAGS_FOLDER/tutorial_01.py
[2016-12-10 11:27:47,234] {sequential_executor.py:26} INFO - Executing command: airflow run tutorial_me remove_file 2015-01-04T00:00:00 --local -sd DAGS_FOLDER/tutorial_01.py
[2016-12-10 11:27:48,050] {models.py:154} INFO - Filling up the DagBag from /Users/user_01/airflow/dags/tutorial_01.py
[2016-12-10 11:27:48,101] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): save_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:48,102] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): remove_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:48,942] {models.py:154} INFO - Filling up the DagBag from /Users/user_01/airflow/dags/tutorial_01.py
[2016-12-10 11:27:48,998] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): save_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:48,998] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): remove_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:49,020] {models.py:1196} INFO -
--------------------------------------------------------------------------------
Starting attempt 1 of 1
--------------------------------------------------------------------------------
[2016-12-10 11:27:49,046] {models.py:1219} INFO - Executing <Task(PythonOperator): remove_file> on 2015-01-04 00:00:00
[2016-12-10 11:27:49,054] {python_operator.py:67} INFO - Done. Returned value was: None
[2016-12-10 11:27:55,168] {models.py:154} INFO - Filling up the DagBag from /Users/user_01/airflow/dags
[2016-12-10 11:27:55,219] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): save_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:55,220] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): remove_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:55,231] {base_executor.py:36} INFO - Adding to queue: airflow run tutorial_me remove_file 2015-01-04T00:00:00 --local -sd DAGS_FOLDER/tutorial_01.py
[2016-12-10 11:27:55,236] {sequential_executor.py:26} INFO - Executing command: airflow run tutorial_me remove_file 2015-01-04T00:00:00 --local -sd DAGS_FOLDER/tutorial_01.py
[2016-12-10 11:27:56,030] {models.py:154} INFO - Filling up the DagBag from /Users/user_01/airflow/dags/tutorial_01.py
[2016-12-10 11:27:56,082] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): save_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:56,082] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): remove_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:56,899] {models.py:154} INFO - Filling up the DagBag from /Users/user_01/airflow/dags/tutorial_01.py
[2016-12-10 11:27:56,950] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): save_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:56,951] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): remove_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:56,967] {models.py:1150} INFO -
Answer 0 (score: 1)
Airflow is designed to maintain a history of its DAG runs so that it can process batches of data in sequence and guarantee that each task runs exactly once per DagRun.
For what you are trying to do, the simplest approach is probably to bypass the scheduler and externally trigger a DagRun with an execution date of "now", including the full date and time. That ensures every invocation executes all of the tasks exactly once, and each run is independent of any previous run. You will want depends_on_past=False, and you may also want to set max_active_runs to a very large value, because any failed DagRuns remain "active" and you do not want them interfering with new invocations.
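A minimal sketch of that idea, assuming the Airflow 1.x CLI (where `airflow trigger_dag` accepts `-e` for the execution date); the actual CLI call is left commented out so the snippet stands alone:

```python
# Build a one-off trigger command whose execution date is "now", so every
# invocation creates a fresh, independent DagRun that cannot collide with
# a previous one. (Assumes the Airflow 1.x CLI is on the PATH.)
from datetime import datetime
import subprocess

execution_date = datetime.utcnow().isoformat()
cmd = ["airflow", "trigger_dag", "tutorial_me", "-e", execution_date]
print(" ".join(cmd))
# subprocess.call(cmd)  # uncomment where Airflow is installed
```

Because the execution date includes the full timestamp, repeating this command always produces a new DagRun rather than re-using an old one.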
Answer 1 (score: 1)
I think your request is similar to airflow issue #198:
"For cases where we need only the latest in a series of task instance runs and want to mark the others as skipped. For example, we may have a job that performs a database snapshot daily. If the DAG is paused for 5 days and then unpaused, we don't want to run all 5 runs, just the latest. With this feature we would have 'cron' functionality for scheduling tasks that are not related to ETL."
This issue was resolved by the LatestOnlyOperator feature (documented here).
Its usage is described in the official docs at https://airflow.apache.org/concepts.html#latest-run-only:
import datetime as dt

from airflow import DAG
from airflow.operators.latest_only_operator import LatestOnlyOperator

dag = DAG(
    dag_id='latest_only_with_trigger',
    schedule_interval=dt.timedelta(hours=4),
    start_date=dt.datetime(2016, 9, 20),
)

latest_only = LatestOnlyOperator(task_id='latest_only', dag=dag)
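For intuition, the check LatestOnlyOperator performs can be approximated as follows. This is a simplified sketch of the logic, not the actual Airflow source: a run counts as "latest" when the current time falls inside the schedule window immediately after that run's data window.

```python
import datetime as dt

def is_latest_run(execution_date, interval, now):
    """Simplified sketch of the LatestOnlyOperator decision."""
    left_window = execution_date + interval   # when this run is triggered
    right_window = left_window + interval     # when the next run is triggered
    return left_window <= now < right_window

now = dt.datetime(2016, 9, 21, 12, 0)
interval = dt.timedelta(hours=4)
print(is_latest_run(now - interval, interval, now))      # most recent run -> True
print(is_latest_run(now - 3 * interval, interval, now))  # older, backfilled run -> False
```

Any task chained downstream of latest_only is skipped on runs where this check is false, which is exactly the "only run the newest one" behavior the issue asked for.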