Troubleshooting Celery workers run under Supervisor for a Django application

Date: 2016-05-19 20:50:57

Tags: django redis django-admin celery supervisor

I have a Django application, and my goal is to run Celery tasks through Redis.

The project folder structure is as follows:

/mhb11/myfolder/myproject
├── myproject
│   ├── celery.py       # The Celery app file
│   ├── __init__.py     # The project module file (modified)
│   ├── settings.py     # Including Celery settings
│   ├── urls.py
│   └── wsgi.py
├── manage.py
├── celerybeat-schedule
└── myapp
    ├── __init__.py
    ├── models.py
    ├── tasks.py        # File containing tasks for this app
    ├── tests.py
    └── views.py

I have a celery.conf in /etc/supervisor/conf.d, which contains:

[program:celery]
command=/home/mhb11/.virtualenvs/myenv/local/lib/python2.7/site-packages/celery/bin/celery --app=myproject.celery:app worker -l info
command=/home/mhb11/.virtualenvs/myenv/local/lib/python2.7/site-packages/celery/bin/celery --app=myproject.celery:app beat -l info
directory = /home/mhb11/myfolder/myproject
user=mhb11
numprocs=1
stdout_logfile = /etc/supervisor/logs/celery-worker.log
stderr_logfile = /etc/supervisor/logs/celery-worker.log
autostart = true
autorestart = true
startsecs=10
stopwaitsecs = 600
killasgroup = true
priority = 998

In /etc/supervisor/logs I have an empty file named celery-worker.log. With this setup in place, I ran the following commands:

sudo supervisorctl reread
sudo supervisorctl update

After doing this, my Celery workers should start, but they don't; that is, nothing shows up in the celery-worker.log file I set up. I don't know what I'm missing, since this is my first time setting all of this up. Can you help me figure it out?

djcelery is part of INSTALLED_APPS. The other relevant settings in settings.py are:

import djcelery
djcelery.setup_loader()

BROKER_URL = 'redis://localhost:6379/0'

BROKER_TRANSPORT = 'redis'

CELERY_IMPORTS = ('myapp.tasks', )  

CELERY_ALWAYS_EAGER = False

CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'

CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_IGNORE_RESULT=True

from datetime import timedelta

CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'

# CELERYBEAT_SCHEDULE = {
#   'tasks.rank_all_photos': {
#       'task': 'tasks.rank_all_photos',
#       'schedule': timedelta(seconds=30),
#   },
# }

CELERY_TIMEZONE = 'UTC'

My celery.py contains:

#this is the celery daemon
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myapp', broker='redis://localhost:6379/0', backend='redis://localhost:6379/0',include=['myfolder.myapp.tasks'])
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS) 

app.conf.update(
    CELERY_TASK_RESULT_EXPIRES=3600,
)

if __name__ == '__main__':
    app.start()
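
(For reference while debugging, the broker connection and task registry of this app can be checked from a Django shell; a minimal sketch, assuming Redis is running locally on port 6379 as configured above:)

# Run inside `python manage.py shell` from /home/mhb11/myfolder/myproject.
# A quick sanity check -- not part of the project, just for debugging.
from myproject.celery import app

# Raises an error if the Redis broker at redis://localhost:6379/0 is unreachable.
app.connection().ensure_connection(max_retries=3)

# Lists every task Celery has registered; 'tasks.rank_all_photos' should appear
# here once myapp.tasks is imported correctly.
print(sorted(app.tasks.keys()))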

__init__.py contains:

from __future__ import absolute_import
from .celery import app as celery_app1

tasks.py contains:

import os
from myproject import celery_app1
import time
from myapp.models import Photo

@celery_app1.task(name='tasks.rank_all_photos')
def rank_all_photos():
    for photo in Photo.objects.order_by('-id')[:400]:
        photo.set_rank()

Finally, in my Django admin panel I have also set up a crontab and a periodic task.

What should I do to get everything running smoothly?

1 Answer:

Answer 0 (score: 0):

You are running your worker, but a worker only executes tasks; something has to put tasks onto the queue for the worker to pick up. Celery beat puts tasks onto the queue according to the schedule configured in the Django admin or in a schedule table/file. Once beat has queued a task, the worker finds it and executes it.
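
(As an aside: since CELERYBEAT_SCHEDULER is set to the djcelery DatabaseScheduler, the entry you created through the Django admin corresponds to a PeriodicTask row. A rough sketch of creating the same thing programmatically, mirroring the 30-second interval from the commented-out CELERYBEAT_SCHEDULE; the name and interval values here are only illustrative:)

# Run inside `python manage.py shell`; this is roughly what the Django admin
# entry amounts to when the djcelery DatabaseScheduler is in use.
from djcelery.models import IntervalSchedule, PeriodicTask

# An interval of every 30 seconds (illustrative; a crontab schedule also works).
every_30s, _ = IntervalSchedule.objects.get_or_create(every=30, period='seconds')

PeriodicTask.objects.get_or_create(
    name='Rank all photos',            # human-readable label, must be unique
    task='tasks.rank_all_photos',      # matches the name= given in the @task decorator
    defaults={'interval': every_30s, 'enabled': True},
)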

So you need to run the celery beat process separately, as its own supervisor program:

command=/home/mhb11/.virtualenvs/myenv/local/lib/python2.7/site-packages/celery/bin/celery --app=myproject.celery:app beat -l info

Celery beat is only needed when you use periodic/scheduled tasks. If you only queue tasks manually by calling a task's .delay() method, you don't need Celery beat running at all.
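
For example, to bypass beat entirely and push one run of your task onto the Redis queue by hand (a quick way to confirm the worker itself is consuming), you can do this from a Django shell:

# From `python manage.py shell` -- queues one execution of the task right now.
from myapp.tasks import rank_all_photos

# .delay() only puts the message on the Redis queue; a running worker
# picks it up and executes it, which should show up in celery-worker.log.
rank_all_photos.delay()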

So your two supervisor program definitions would be:

Beat:

[program:celerybeat]
command=/home/mhb11/.virtualenvs/myenv/local/lib/python2.7/site-packages/celery/bin/celery --app=myproject.celery:app beat -l info
directory = /home/mhb11/myfolder/myproject
user=mhb11
numprocs=1
stdout_logfile = /etc/supervisor/logs/celery-beat.log
stderr_logfile = /etc/supervisor/logs/celery-beat.log
autostart = true
autorestart = true
startsecs=10
stopwaitsecs = 600
killasgroup = true
priority = 998

Worker:

[program:celeryworker]
command=/home/mhb11/.virtualenvs/myenv/local/lib/python2.7/site-packages/celery/bin/celery --app=myproject.celery:app worker -l info
directory = /home/mhb11/myfolder/myproject
user=mhb11
numprocs=1
stdout_logfile = /etc/supervisor/logs/celery-worker.log
stderr_logfile = /etc/supervisor/logs/celery-worker.log
autostart = true
autorestart = true
startsecs=10
stopwaitsecs = 600
killasgroup = true
priority = 998
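
After saving both files under /etc/supervisor/conf.d, re-run the sudo supervisorctl reread and sudo supervisorctl update from your question so supervisor registers and starts both programs (autostart is true in each). If the worker still fails silently, running it in the foreground usually surfaces the underlying import or configuration error; a minimal sketch in Python, assuming it is run from /home/mhb11/myfolder/myproject inside the virtualenv (the file name is just illustrative):

# debug_worker.py -- hypothetical helper, equivalent to running
# `celery --app=myproject.celery:app worker -l info` in the foreground,
# so errors print to the terminal instead of an empty supervisor log.
import os

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

from myproject.celery import app

if __name__ == '__main__':
    app.worker_main(['worker', '--loglevel=info'])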