Trying to run Airflow with systemd

Date: 2018-09-12 10:05:09

Tags: apache ubuntu airflow systemd

I am trying to run Airflow with systemd on an Ubuntu server. I followed the quick start guide and the tutorial from the Airflow documentation, and managed to install Airflow and run it successfully with the command:

airflow webserver -p 8080

After installing systemd and some trial and error with the configuration files, I managed to run Airflow with the command

sudo systemctl start airflow

Airflow ran fine for a week, until today, when I restarted it with the command

sudo systemctl restart airflow

Running sudo systemctl status airflow now gives me one of the following two messages:

● airflow.service - Airflow webserver daemon
 Loaded: loaded (/lib/systemd/system/airflow.service; enabled; vendor preset: enabled)
 Active: activating (auto-restart) (Result: exit-code) since Wed 2018-09-12 09:23:01 UTC; 1s ago
Process: 3115 ExecStart=/opt/miniconda3/bin/airflow webserver -p 8080 --pid /home/user/airflow/airflow-webserver.pid --daemon (code=exited, status=1/FAILURE)
Main PID: 3115 (code=exited, status=1/FAILURE)

Sep 12 09:23:01 server-service systemd[1]: airflow.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 09:23:01 server-service systemd[1]: airflow.service: Unit entered failed state.
Sep 12 09:23:01 server-service systemd[1]: airflow.service: Failed with result 'exit-code'.

● airflow.service - Airflow webserver daemon
 Loaded: loaded (/lib/systemd/system/airflow.service; enabled; vendor preset: enabled)
 Active: active (running) since Wed 2018-09-12 09:23:54 UTC; 1s ago
Main PID: 3399 (airflow)
  Tasks: 1
 Memory: 56.1M
    CPU: 1.203s
 CGroup: /system.slice/airflow.service
         └─3399 /opt/miniconda3/bin/python /opt/miniconda3/bin/airflow webserver -p 8080 --pid /home/user/airflow/airflow-webserver.pid --daemon

Sep 12 09:23:54 server-service systemd[1]: Stopped Airflow webserver daemon.
Sep 12 09:23:54 server-service systemd[1]: Started Airflow webserver daemon.
Sep 12 09:23:54 server-service airflow[3399]: [2018-09-12 09:23:54,372] {__init__.py:57} INFO - Using executor SequentialExecutor
Sep 12 09:23:55 server-service airflow[3399]:   ____________       _____________
Sep 12 09:23:55 server-service airflow[3399]:  ____    |__( )_________  __/__  /________      __
Sep 12 09:23:55 server-service airflow[3399]: ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
Sep 12 09:23:55 server-service airflow[3399]: ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
Sep 12 09:23:55 server-service airflow[3399]:  _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
Sep 12 09:23:55 server-service airflow[3399]:  
Sep 12 09:23:55 server-service airflow[3399]: [2018-09-12 09:23:55,124] [3399] {models.py:167} INFO - Filling up the DagBag from /root/airflow/dags

I believe the first message is returned when systemd failed to start Airflow, and the second message is returned while systemd is still in the process of starting Airflow.

Since the first error message contains airflow.service: Service hold-off time over, scheduling restart., I thought I might have this problem, but running sudo systemctl enable airflow.service does not solve it (I think airflow.service is already enabled anyway, as indicated here: Loaded: loaded (/lib/systemd/system/airflow.service; enabled; vendor preset: enabled)).

While trying to fix the problem, I found some weird things that I don't understand:

  • According to the airflow quick start page, running Airflow manually creates a file called airflow-webserver.pid in the Airflow home directory, while running Airflow with systemd creates a file called webserver.pid in the /run/airflow directory. Initially, when I tried to run Airflow with systemd, I noticed that /run/airflow/webserver.pid was not created. Setting PIDFile=/home/user/airflow/airflow-webserver.pid solved this; systemd worked with the worker pid provided in the airflow-webserver.pid file. But now that I have run sudo systemctl restart airflow, it no longer works; running airflow webserver -p 8080 does not create the airflow-webserver.pid that I point to.

  • Since running Airflow no longer automatically creates the /run/airflow/webserver.pid or /home/user/airflow/airflow-webserver.pid file, I tried creating them manually in the desired directories. However, if I run Airflow with systemd after creating the /run/airflow/webserver.pid file, the file gets deleted (not replaced), and if I run Airflow manually with airflow webserver -p 8080 after creating the /run/airflow/webserver.pid file, that file also gets deleted.
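For context, the lifecycle that a daemon's pidfile typically follows can be sketched in a few lines (a minimal illustration, not Airflow's actual implementation; the path is hypothetical): the pid is written on startup and the file is removed on clean shutdown, which is why the file disappears as soon as the process that owns it stops.

```python
import os
import tempfile

class PidFile:
    """Write our pid to a file on entry, delete the file on exit.

    Mirrors the lifecycle a daemon's pidfile typically follows: the
    file only exists while the owning process is running, so stopping
    the daemon (cleanly) also removes it.
    """

    def __init__(self, path):
        self.path = path

    def __enter__(self):
        with open(self.path, "w") as f:
            f.write(str(os.getpid()))
        return self

    def __exit__(self, *exc):
        # A clean shutdown removes the pidfile; a crash would leave it stale.
        if os.path.exists(self.path):
            os.remove(self.path)

# Demonstrate: the pidfile exists only while the "daemon" is running.
pid_path = os.path.join(tempfile.mkdtemp(), "webserver.pid")
with PidFile(pid_path):
    assert os.path.exists(pid_path)
assert not os.path.exists(pid_path)
```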

My airflow.service file looks as follows:

[Unit]
Description=Airflow webserver daemon
After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service

[Service]
EnvironmentFile=/etc/sysconfig/airflow
PIDFile=/home/user/airflow/airflow-webserver.pid
User=%i
Group=%i
Type=simple
ExecStart=/opt/miniconda3/bin/airflow webserver -p 8080 --pid /home/user/airflow/airflow-webserver.pid --daemon

Restart=on-failure
RestartSec=5s
PrivateTmp=true

[Install]
WantedBy=multi-user.target
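One likely culprit in this unit is the combination of Type=simple with --daemon: Type=simple expects ExecStart to stay in the foreground, but --daemon makes the webserver fork and detach, so systemd loses track of the real process (consistent with the "Supervising process ... which is not our child" message later in this question). A sketch of the [Service] section without daemonizing follows; this is an assumption to verify, not a confirmed fix:

```ini
[Service]
EnvironmentFile=/etc/sysconfig/airflow
User=%i
Group=%i
Type=simple
# Run in the foreground so systemd itself supervises the process;
# with Type=simple, neither --daemon nor PIDFile= is needed.
ExecStart=/opt/miniconda3/bin/airflow webserver -p 8080
Restart=on-failure
RestartSec=5s
PrivateTmp=true
```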

Question: how do I fix these problems so that I can run Airflow with systemd?

Edit: after restarting the systemd daemon once more, I managed to get Airflow running (or so it appears). Running systemctl status airflow returns:

● airflow.service - Airflow webserver daemon
   Loaded: loaded (/lib/systemd/system/airflow.service; enabled; vendor preset: enabled)
   Active: active (running) since Wed 2018-09-12 10:49:17 UTC; 6min ago
 Main PID: 30054
    Tasks: 0
   Memory: 388.0K
      CPU: 2.987s
   CGroup: /system.slice/airflow.service

Sep 12 10:49:22 server-service airflow[30031]:   File "/opt/miniconda3/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 203, in raise_from_cause
Sep 12 10:49:22 server-service airflow[30031]:     reraise(type(exception), exception, tb=exc_tb, cause=cause)
Sep 12 10:49:22 server-service airflow[30031]:   File "/opt/miniconda3/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 186, in reraise
Sep 12 10:49:22 server-service airflow[30031]:     raise value.with_traceback(tb)
Sep 12 10:49:22 server-service airflow[30031]:   File "/opt/miniconda3/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1182, in _execute_context
Sep 12 10:49:22 server-service airflow[30031]:     context)
Sep 12 10:49:22 server-service airflow[30031]:   File "/opt/miniconda3/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 470, in do_execute
Sep 12 10:49:22 server-service airflow[30031]:     cursor.execute(statement, parameters)
Sep 12 10:49:22 server-service airflow[30031]: sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table: connection [SQL: 'SELECT connection.conn_id AS connection_conn_id \nFROM connection G
Sep 12 10:49:23 server-service systemd[1]: airflow.service: Supervising process 30054 which is not our child. We'll most likely not notice when it exits.

Unfortunately, I cannot access Airflow in my browser. Moreover, starting Airflow with systemd or manually does not produce the desired files /run/airflow/webserver.pid or /home/user/airflow/airflow-webserver.pid. I tried checking whether they exist somewhere else with sudo find ~/ -type f -name "webserver.pid", but this returns nothing.

I think the message Supervising process 30054 which is not our child. We'll most likely not notice when it exits. is related to my problem, since I never got this message in the past when Airflow was running successfully under systemd. Could it be that systemctl status airflow reports Airflow as running for 6 minutes because systemd did not notice that the worker with pid 30054 is no longer active?

Edit 2: I found out why airflow-webserver.pid is not created by Airflow. When you run airflow webserver -p 8080, Airflow does create the .pid file, but when you stop the webserver, systemd removes the .pid file again (if Airflow itself doesn't). This explains why airflow-webserver.pid was missing, but it does not explain why webserver.pid is not in the /run/airflow directory.

2 Answers:

Answer 0 (score: 1):

I know I'm digging up a stale post, but I was also trying to figure out why I could not get the scheduler to run automatically while the server is running.

I did find a solution that works on Ubuntu 18.04 and 18.10; hopefully it helps you.

I have provided a full article on how to install Airflow and PostgreSQL on the backend at the link here.

From the latter half of my article: essentially, it comes down to making a specific change to the airflow-scheduler.service file.

This is one of the "gotchas" of implementing this on Ubuntu. The development team that created Airflow designed it to run on a number of different Linux distributions, so a few minor (but critical) changes are needed for Airflow to run automatically when the server is turned on. The default systemd service file originally looks like this:

[Unit]
Description=Airflow scheduler daemon
After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
Wants=postgresql.service mysql.service redis.service rabbitmq-server.service

[Service]
EnvironmentFile=/etc/sysconfig/airflow
User=airflow
Group=airflow
Type=simple
ExecStart=/bin/airflow scheduler
Restart=always
RestartSec=5s

[Install]
WantedBy=multi-user.target

However, this will not work, as the "EnvironmentFile" protocol doesn't fly on Ubuntu 18. Instead, comment out that line and add:

Environment="PATH=/home/ubuntu/anaconda3/envs/airflow/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"

If you want the UI to start automatically as well, you will want to create systemd service files for at least the Airflow scheduler and the webserver. In this implementation we do in fact want both, so we will create two files, airflow-scheduler.service and airflow-webserver.service. Both will be copied into the /etc/systemd/system folder. They are as follows:


airflow-scheduler.service

[Unit]
Description=Airflow scheduler daemon
After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
Wants=postgresql.service mysql.service redis.service rabbitmq-server.service

[Service]
#EnvironmentFile=/etc/default/airflow
Environment="PATH=/home/ubuntu/anaconda3/envs/airflow/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
User=airflow
Group=airflow
Type=simple
ExecStart=/home/ubuntu/anaconda3/envs/airflow/bin/airflow scheduler
Restart=always
RestartSec=5s

[Install]
WantedBy=multi-user.target

airflow-webserver.service

[Unit]
Description=Airflow webserver daemon
After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
Wants=postgresql.service mysql.service redis.service rabbitmq-server.service

[Service]
#EnvironmentFile=/etc/default/airflow
Environment="PATH=/home/ubuntu/anaconda3/envs/airflow/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
User=airflow
Group=airflow
Type=simple
ExecStart=/home/ubuntu/anaconda3/envs/airflow/bin/airflow webserver -p 8085 --pid /home/ubuntu/airflow/airflow-webserver.pid
Restart=on-failure
RestartSec=5s
PrivateTmp=true

[Install]
WantedBy=multi-user.target

Finally, once both files have been copied into the /etc/systemd/system folder with the superuser copy command sudo cp, it is time to fire them up:

sudo systemctl enable airflow-scheduler
sudo systemctl start airflow-scheduler
sudo systemctl enable airflow-webserver
sudo systemctl start airflow-webserver

Answer 1 (score: 0):

The error sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table: connection indicates that your Airflow process cannot access a properly initialized database. Are you sure you ran airflow initdb before setting up the Airflow webserver?

I have been running Airflow under systemd in my AWS Airflow Stack, where you can find the configuration parameters. For completeness, I will transcribe my configuration files here, but I could not figure out just by looking at yours why it isn't working.

My configuration is tailored to run under the user ec2-user in an Amazon Linux 2 machine, but I believe it should work for Ubuntu as well. Note that since I run the database, redis, and everything else on other machines, I removed those from the After section.

        /usr/bin/turbine:
            #!/bin/sh
            exec airflow scheduler

        /etc/sysconfig/airflow:
            AIRFLOW_HOME=/efs/airflow
            AIRFLOW__CELERY__DEFAULT_QUEUE=${queue}
            ... your environment configs
            AWS_DEFAULT_REGION=${AWS::Region}

        /usr/lib/systemd/system/airflow.service:
            [Unit]
            Description=Airflow daemon
            After=network.target
            [Service]
            EnvironmentFile=/etc/sysconfig/airflow
            User=ec2-user
            Group=ec2-user
            Type=simple
            ExecStart=/usr/bin/turbine
            Restart=always
            RestartSec=5s
            [Install]
            WantedBy=multi-user.target

        /usr/lib/tmpfiles.d/airflow.conf:
            D /run/airflow 0755 ec2-user ec2-user
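As an aside, the same runtime directory can also be created without a tmpfiles.d entry by letting the unit itself request it via RuntimeDirectory= (a sketch, assuming your systemd version supports the directive):

```ini
[Service]
# systemd creates /run/airflow owned by User/Group when the unit
# starts and removes it again when the unit stops.
RuntimeDirectory=airflow
RuntimeDirectoryMode=0755
```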

In addition to these, I also set up a watcher service to make sure we always use the latest environment file with systemd:

        /usr/lib/systemd/system/watcher.service:
            [Unit]
            Description=Airflow configuration watcher
            After=network.target
            [Service]
            Type=oneshot
            ExecStartPre=/usr/bin/systemctl daemon-reload
            ExecStart=/usr/bin/systemctl restart airflow
            [Install]
            WantedBy=multi-user.target

        /usr/lib/systemd/system/watcher.path:
            [Path]
            PathModified=/etc/sysconfig/airflow
            [Install]
            WantedBy=multi-user.target

Everything is set up with:

systemctl enable airflow.service
systemctl enable watcher.path
systemctl start airflow.service
systemctl start watcher.path