Celery tasks from different applications end up in each other's log files

Date: 2019-03-13 09:55:47

Tags: redis celery freebsd celery-task celery-log

I am setting up Celery on a FreeBSD server, but judging by the log files I have run into a problem.

My setup:

  • a FreeBSD server
  • 2 Django applications: app1 and app2
  • Celery running as a daemon, with Redis as the broker
  • each application has its own Celery tasks

My Celery configuration files:

In /etc/default/celeryd_app1 I have:

# Names of nodes to start
CELERYD_NODES="worker"

# Absolute or relative path to the 'celery' command:
CELERY_BIN="/usr/local/www/app1/venv/bin/celery"

# App instance to use
CELERY_APP="main"

# Where to chdir at start.
CELERYD_CHDIR="/usr/local/www/app1/src/"

# Extra command-line arguments to the worker
CELERYD_OPTS="--time-limit=300 --concurrency=8"

# Set logging level to DEBUG
#CELERYD_LOG_LEVEL="DEBUG"

# %n will be replaced with the first part of the nodename.
CELERYD_LOG_FILE="/var/log/celery/app1/%n%I.log"
CELERYD_PID_FILE="/var/run/celery/app1/%n.pid"

# Workers should run as an unprivileged user.
CELERYD_USER="celery"
CELERYD_GROUP="celery"

# If enabled pid and log directories will be created if missing,
# and owned by the userid/group configured.
CELERY_CREATE_DIRS=1

My celeryd_app2 file is exactly the same.

Django settings file with the Celery settings:

CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_IGNORE_RESULT = False
CELERY_TASK_TRACK_STARTED = True
# Add a one-minute timeout to all Celery tasks.
CELERYD_TASK_SOFT_TIME_LIMIT = 60
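For reference, settings with the `CELERY_` prefix like these are usually loaded by a small Celery app module inside the Django project. A minimal sketch of what that module typically looks like (the module name `main` is taken from `CELERY_APP="main"` above; everything else follows the standard Django/Celery wiring pattern and is an assumption about this project):

```python
# main/celery.py -- minimal sketch; assumes the Django settings module
# is main.settings and that Celery settings use the CELERY_ prefix.
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'main.settings')

app = Celery('main')

# Read all CELERY_* settings (CELERY_BROKER_URL, etc.) from Django settings.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Auto-discover tasks.py modules in all installed Django apps.
app.autodiscover_tasks()
```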

Both applications use the same Redis port in their settings.

My problem:

When I run a Celery task for app1, I find the log for that task in app2's log file, with an error like this:

Received unregistered task of type 'app1.task.my_task_for_app1'
...
KeyError: 'app1.task.my_task_for_app1'
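This error pattern suggests that both workers are consuming from the same default queue (named `celery`) in the same Redis database, so app2's worker can pick up app1's messages and then fail because the task is not registered in app2. Besides using separate Redis databases, another common fix is to give each application its own queue; a hedged sketch (the queue names `app1`/`app2` are made up for illustration):

```python
# In app1's Django settings -- route app1's tasks to a dedicated queue.
CELERY_TASK_DEFAULT_QUEUE = 'app1'

# In app2's Django settings, the analogous line would be:
# CELERY_TASK_DEFAULT_QUEUE = 'app2'
```

Each worker must then be told to consume only its own queue, e.g. by appending `-Q app1` (or `-Q app2`) to `CELERYD_OPTS` in the matching /etc/default/celeryd_* file.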

Is there a problem in my Celery configuration files? Do I have to use different Redis ports? If so, how do I do that?

Thank you very much

1 answer:

Answer 0 (score: 1)

I think the problem is that you are using the same Redis database in both applications:

CELERY_BROKER_URL = 'redis://localhost:6379'

See the guide on using Redis as a broker. Just use a different database for each application, e.g.

CELERY_BROKER_URL = 'redis://localhost:6379/0'

CELERY_BROKER_URL = 'redis://localhost:6379/1'
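The database number is the path component of the `redis://` URL; when it is omitted, the connection defaults to database 0, which is why both applications currently share one broker. A quick standard-library check illustrating this (the helper name `redis_db` is just for illustration):

```python
from urllib.parse import urlparse

def redis_db(url):
    # The path component of a redis:// URL selects the database number;
    # an empty path means database 0 (the default).
    path = urlparse(url).path.lstrip("/")
    return int(path) if path else 0

print(redis_db("redis://localhost:6379"))    # both apps share DB 0 today
print(redis_db("redis://localhost:6379/1"))  # app2 after the change
```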