Solving a celery and redis problem for a Django app

Time: 2016-05-16 21:43:27

Tags: python django redis celery

For a Django app, I need help troubleshooting celery running with redis. In this application, users upload photos, and I'm trying to run a background process that periodically ranks all the photos based on user votes and the time since upload (think of it as a basic reddit-like ranking algorithm).
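(For readers unfamiliar with "reddit-like ranking": the idea is that a photo's rank combines vote score with a time component, so newer posts with the same votes rank higher. The question never shows the `set_rank()` method, so the sketch below is purely illustrative — a minimal hot-score function in the spirit of reddit's algorithm, with made-up constants:)

```python
import math
from datetime import datetime, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def hot_score(ups, downs, posted_at):
    """Illustrative reddit-style 'hot' score: log-scaled vote score
    plus a time term, so newer items outrank equally-voted older ones."""
    score = ups - downs
    order = math.log10(max(abs(score), 1))
    sign = 1 if score > 0 else (-1 if score < 0 else 0)
    seconds = (posted_at - EPOCH).total_seconds()
    # 45000 is an arbitrary decay constant, not from the question
    return round(sign * order + seconds / 45000, 7)
```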

My project folder (the same folder as settings.py) contains celery.py with:

from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myapp', broker='redis://localhost:6379/0', backend='redis://localhost:6379/0', include=['myfolder.myapp.tasks'])
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS) 

app.conf.update(
    CELERY_TASK_RESULT_EXPIRES=3600,
)

if __name__ == '__main__':
    app.start()
tasks.py, referenced in the file above, lives in the myapp folder and contains:

import os
from myproject import celery_app1
import time
from myapp.models import Photo

@celery_app1.task(name='tasks.rank_all_photos')
def rank_all_photos():
    for photo in Photo.objects.all():
        photo.set_rank()
        print "ranked"
__init__.py in the myproject folder contains:

from __future__ import absolute_import
from .celery import app as celery_app1

Finally, the relevant configuration from settings.py is as follows:

import djcelery
djcelery.setup_loader()

BROKER_URL = 'redis://localhost:6379/0'

CELERY_IMPORTS = ('myapp.tasks', )  

CELERY_ALWAYS_EAGER = False

CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'

CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_IGNORE_RESULT = True

from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'tasks.rank_all_photos': {
        'task': 'tasks.rank_all_photos',
        'schedule': timedelta(seconds=10),
    },
}
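(As an aside, not part of the original settings: a 10-second interval is aggressive for re-ranking every photo. Celery's standard `celery.schedules.crontab` can express coarser schedules in the same `CELERYBEAT_SCHEDULE` dict — a sketch of what that would look like:)

```python
from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    'tasks.rank_all_photos': {
        'task': 'tasks.rank_all_photos',
        # every 10 minutes instead of every 10 seconds
        'schedule': crontab(minute='*/10'),
    },
}
```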

CELERY_TIMEZONE = 'UTC'

Note that 'djcelery' is also included in INSTALLED_APPS. All of the above looks correct to me. When I run celery worker -A myproject --loglevel=INFO, I see this output:

---- **** ----- 
--- * ***  * -- Linux-3.16.0-30-generic-x86_64-with-Ubuntu-14.04-trusty
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         myapp:0x7f0a15acb310
- ** ---------- .> transport:   redis://localhost:6379/0
- ** ---------- .> results:     redis://localhost:6379/0
- *** --- * --- .> concurrency: 2 (prefork)
-- ******* ---- 
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery


[tasks]
  . tasks.rank_all_photos

[2016-05-17 02:19:19,733: INFO/MainProcess] Connected to redis://localhost:6379/0
[2016-05-17 02:19:19,745: INFO/MainProcess] mingle: searching for neighbors
[2016-05-17 02:19:20,750: INFO/MainProcess] mingle: all alone
/home/hassan/.virtualenvs/myenv/local/lib/python2.7/site-packages/celery/fixups/django.py:265: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '

[2016-05-17 02:19:20,761: WARNING/MainProcess] /home/hassan/.virtualenvs/myenv/local/lib/python2.7/site-packages/celery/fixups/django.py:265: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '

[2016-05-17 02:19:20,761: WARNING/MainProcess] celery@hassan ready.

I see this output, but no evidence that tasks.py is ever executed: nothing is printed, and no ranking happens.

I'm a beginner, so I may be missing something basic. Can you take a look and help me fix this? For now I'm just testing it locally in the foreground; once it's up and running I'll look into daemonizing it for production. Thanks.

1 Answer:

Answer 0 (score: 0):

You also need to run celery beat. The worker only consumes tasks; it is beat that reads CELERYBEAT_SCHEDULE and sends the periodic task every 10 seconds.

celery beat -A myproject --loglevel=INFO
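(For local testing, Celery can also run the beat scheduler embedded inside the worker process with the `-B` flag, so you only need one terminal; this mode is documented as unsuitable for production:)

```shell
# worker + embedded beat scheduler in a single process (dev only)
celery worker -A myproject -B --loglevel=INFO
```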