Hello, I'm new to Airflow. I'm trying to run my own custom jar, generated by Talend Open Studio BigData, as a DAG. When I import my DAG through the terminal, no error appears, yet the DAG is not added to the DAG list in the Airflow UI. Here is the code of my .py file:
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime
from airflow.utils.email import send_email
import os
import sys

bib_app = "/home/user/Docs/JObforAirflow/test/test_run.sh"

default_args = {
    'owner': 'yabid',
    'depends_on_past': False,
    'start_date': datetime(2019, 4, 29),
    'email': ['user@user.com'],
    'email_on_failure': True,
    'email_on_success': True,
    'provide_context': True
}

args = {
    'owner': 'yabid',
    'email': ['user@user.com'],
    'start_date': datetime(2019, 4, 25),
    'provide_context': True
}

dag = DAG('run_jar', default_args=default_args)

t1 = BashOperator(
    task_id='dependency',
    bash_command=bib_app,
    dag=dag)

t2 = BashOperator(
    task_id='t2',
    dag=dag,
    bash_command='java -cp /home/user/Docs/JObforAirflow/test/jobbatch.jar')

t1.set_upstream(t2)
Answer 0 (score: 1)
Have you copied this DAG file to ~/airflow/dags? All of your *.py files need to be copied to AIRFLOW_HOME/dags, where AIRFLOW_HOME = ~/airflow.
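As a quick sanity check, here is a minimal shell sketch of that step (the DAG file name `run_jar.py` and the source path are assumptions based on the question; adjust them to your setup):

```shell
# Airflow only parses .py files under $AIRFLOW_HOME/dags (default ~/airflow).
AIRFLOW_HOME="${AIRFLOW_HOME:-$HOME/airflow}"
DAGS_DIR="$AIRFLOW_HOME/dags"
mkdir -p "$DAGS_DIR"

# Copy your DAG file into the dags folder (source path from the question):
# cp /home/user/Docs/JObforAirflow/test/run_jar.py "$DAGS_DIR/"

echo "dags folder ready: $DAGS_DIR"
```

You can then confirm the DAG was picked up with `airflow list_dags` (Airflow 1.x CLI).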
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime
from airflow.utils.email import send_email
import os
import sys

bib_app = "/home/user/Docs/JObforAirflow/test/test_run.sh"

default_args = {
    'owner': 'yabid',
    'depends_on_past': False,
    'start_date': datetime(2019, 4, 25),
    'email': ['user@user.com'],
    'email_on_failure': True,
    'email_on_success': True,
    'provide_context': True
}

dag = DAG('run_jar', default_args=default_args)

t1 = BashOperator(
    task_id='dependency',
    bash_command=bib_app,
    dag=dag)

t2 = BashOperator(
    task_id='t2',
    dag=dag,
    bash_command='java -cp /home/user/Docs/JObforAirflow/test/jobbatch.jar')
t1 >> t2
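Note that the dependency direction also changed here: in the question, `t1.set_upstream(t2)` makes t2 run before t1, while `t1 >> t2` runs t1 first. A minimal stand-in sketch (no Airflow install needed; the `Task` class below is hypothetical, not Airflow's API) showing what the two notations record:

```python
class Task:
    """Hypothetical stand-in for an Airflow operator, tracking upstream tasks."""
    def __init__(self, task_id):
        self.task_id = task_id
        self.upstream_ids = []           # tasks that must finish before this one

    def set_upstream(self, other):       # self runs AFTER other
        self.upstream_ids.append(other.task_id)

    def __rshift__(self, other):         # self >> other: other runs AFTER self
        other.upstream_ids.append(self.task_id)
        return other

t1, t2 = Task('dependency'), Task('t2')
t1.set_upstream(t2)                      # question's code: t2 first, then t1
print(t1.upstream_ids)                   # ['t2']

a, b = Task('dependency'), Task('t2')
a >> b                                   # answer's code: t1 first, then t2
print(b.upstream_ids)                    # ['dependency']
```

So if you intend the shell script to run before the jar, `t1 >> t2` is the ordering you want.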
Answer 1 (score: 0)
Your code has a line with an unclosed string: 'email': ['user@user.com], — if you try to run this code in Airflow, the DAG will fail to load. Also make sure the file is in the dags folder, and after adding a new DAG file it is recommended that you restart the scheduler so the new DAG is picked up.
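This also explains why the DAG vanishes silently: Airflow imports each DAG file, and a SyntaxError aborts the import, so no DAG object is ever registered. A quick sketch reproducing the failure with Python's built-in `compile` (the string below is the broken line quoted above, wrapped in braces to make it a standalone expression):

```python
# An unclosed string literal is a SyntaxError, so Airflow can never
# import the file and the DAG never reaches the UI.
bad_line = "{'email': ['user@user.com]}"  # note the missing closing quote
try:
    compile(bad_line, "<dag snippet>", "eval")
    print("parsed OK")
except SyntaxError:
    print("SyntaxError: Airflow would never import this file")
```

Running `python your_dag_file.py` directly is an easy way to surface such errors before copying the file into the dags folder.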