I have Airflow set up to run with systemd as per this. It worked great for a couple of days, but now it is throwing errors that I don't know how to resolve. Running sudo systemctl start airflow-webserver.service effectively does nothing, whereas running airflow webserver directly works (for our purposes, though, it needs to run under systemd).
To find out what the error is, I ran sudo systemctl status airflow-webserver.service, which gives the following status and error:
Feb 20 18:54:43 ip-172-31-25-17.ec2.internal airflow[19660]: [2019-02-20 18:54:43,774] {models.py:258} INFO - Filling up the DagBag from /home/ec2-user/airflow/dags
Feb 20 18:54:43 ip-172-31-25-17.ec2.internal airflow[19660]: /home/ec2-user/airflow/dags/statcan_1410009501.py:33: SyntaxWarning: name 'pg_hook' is assigned to before global declaration
Feb 20 18:54:43 ip-172-31-25-17.ec2.internal airflow[19660]: global pg_hook
Feb 20 18:54:43 ip-172-31-25-17.ec2.internal airflow[19660]: /usr/lib/python2.7/site-packages/airflow/utils/helpers.py:346: DeprecationWarning: Importing 'PythonOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operat...irely in Airflow 2.0.
Feb 20 18:54:43 ip-172-31-25-17.ec2.internal airflow[19660]: DeprecationWarning)
Feb 20 18:54:43 ip-172-31-25-17.ec2.internal airflow[19660]: /usr/lib/python2.7/site-packages/airflow/utils/helpers.py:346: DeprecationWarning: Importing 'BashOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator...irely in Airflow 2.0.
Feb 20 18:54:43 ip-172-31-25-17.ec2.internal airflow[19660]: DeprecationWarning)
Feb 20 18:54:44 ip-172-31-25-17.ec2.internal airflow[19660]: [2019-02-20 18:54:44,528] {settings.py:174} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
Feb 20 18:54:45 ip-172-31-25-17.ec2.internal airflow[19660]: [2019-02-20 18:54:45 +0000] [19733] [INFO] Starting gunicorn 19.9.0
Feb 20 18:54:45 ip-172-31-25-17.ec2.internal airflow[19660]: Error: /run/airflow doesn't exist. Can't create pidfile.
The scheduler appears to be working fine, as verified by running systemctl status airflow-scheduler.service and journalctl -f.
Here is the setup of the following systemd files:
/usr/lib/systemd/system/airflow-webserver.service
[Unit]
Description=Airflow scheduler daemon
After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
Wants=postgresql.service mysql.service redis.service rabbitmq-server.service
[Service]
EnvironmentFile=/etc/sysconfig/airflow
User=ec2-user
Type=simple
ExecStart=/bin/airflow scheduler
Restart=always
RestartSec=5s
[Install]
WantedBy=multi-user.target
/etc/tmpfiles.d/airflow.conf
D /run/airflow 0755 airflow airflow
/etc/sysconfig/airflow
AIRFLOW_CONFIG=$AIRFLOW_HOME/airflow.cfg
AIRFLOW_HOME=/home/ec2-user/airflow
Before this error appeared, I moved my Airflow installation from the root directory to my home directory. I'm not sure whether that affects my setup, but I'm including it in case it's relevant.
Can anyone shed some light on this error and how to fix it? I tried to configure systemd as closely as possible to what is indicated, but maybe I'm missing something?
EDIT 2:
Sorry, I pasted the wrong code. Here is my airflow-webserver.service code:
[Unit]
Description=Airflow webserver daemon
After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
Wants=postgresql.service mysql.service redis.service rabbitmq-server.service
[Service]
EnvironmentFile=/etc/sysconfig/airflow
User=ec2-user
Type=simple
ExecStart=/bin/airflow webserver --pid /run/airflow/webserver.pid
Restart=on-failure
RestartSec=5s
PrivateTmp=true
[Install]
WantedBy=multi-user.target
Answer 0 (score: 1)
I encountered this issue too and was able to resolve it by providing runtime directory parameters under [Service] in the airflow-webserver.service unit file:
[Service]
RuntimeDirectory=airflow
RuntimeDirectoryMode=0775
I was not able to figure out how to get it to work with /etc/tmpfiles.d/airflow.conf alone.
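For reference, a minimal sketch of what the complete unit file might look like with those directives in place (the user, paths and ExecStart line are taken from the question; adjust to your install):
[Unit]
Description=Airflow webserver daemon
After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
Wants=postgresql.service mysql.service redis.service rabbitmq-server.service
[Service]
EnvironmentFile=/etc/sysconfig/airflow
User=ec2-user
Type=simple
# systemd pre-creates /run/airflow (owned by the service user) before launching ExecStart
RuntimeDirectory=airflow
RuntimeDirectoryMode=0775
ExecStart=/bin/airflow webserver --pid /run/airflow/webserver.pid
Restart=on-failure
RestartSec=5s
PrivateTmp=true
[Install]
WantedBy=multi-user.target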
Answer 1 (score: 0)
It looks like you are running the scheduler rather than the webserver:
ExecStart=/bin/airflow scheduler
You probably want something like:
ExecStart=/bin/airflow webserver -p 8080 --pid /run/airflow/webserver.pid
Maybe you just copy-pasted the wrong file, in which case please share the correct one (airflow-webserver.service) so we can help you troubleshoot it.
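As a side note (not part of the original answer): after editing a unit file, systemd has to reload it before the change takes effect, so something along these lines is usually needed:
sudo systemctl daemon-reload                       # pick up the edited unit file
sudo systemctl restart airflow-webserver.service
journalctl -u airflow-webserver.service -f         # follow the webserver logs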
Answer 2 (score: 0)
The /etc/tmpfiles.d/airflow.conf config file is used by the systemd-tmpfiles-setup service at boot, so a server reboot should create the /run/airflow directory. That service cannot be restarted manually, as per https://github.com/systemd/systemd/issues/8684.
As suggested in the link above, after copying airflow.conf to /etc/tmpfiles.d/, just run sudo systemd-tmpfiles --create and /run/airflow should be created.
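Put together, a rough sequence under those assumptions (file name and contents taken from the question) would be:
sudo cp airflow.conf /etc/tmpfiles.d/airflow.conf   # contains: D /run/airflow 0755 airflow airflow
sudo systemd-tmpfiles --create                      # apply tmpfiles.d rules now, without rebooting
ls -ld /run/airflow                                 # confirm the directory now exists
sudo systemctl restart airflow-webserver.service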