Task .delay() not working when called from cmd

Date: 2019-01-06 06:15:28

Tags: django python-3.x celery celery-task

I'm running a Celery worker from the shell (cmd). The worker starts fine, but when I call a task like TestTaskOne.delay(), it doesn't work: the cmd just hangs there, I have to exit with ctrl+c, and the worker never receives the task.

Any idea why this is happening?

To start the Celery worker I'm using:

celery -A Project worker -l info -P eventlet

tasks.py

from __future__ import absolute_import, unicode_literals
from celery import task

@task
def TestTaskOne():
    msg = "DEFAULT   TASK   IS   WORKING......"
    return msg
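
For reference, this is roughly the call that hangs, run from python manage.py shell (a minimal sketch; the import path UserAPI.tasks is an assumption about which app holds tasks.py):

# Minimal sketch (run inside python manage.py shell); the import path
# assumes tasks.py lives in an app package named UserAPI.
from UserAPI.tasks import TestTaskOne

result = TestTaskOne.delay()   # this is the call that just hangs
print(result)                  # expected: an AsyncResult with a task id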

celery.py

from __future__ import absolute_import, unicode_literals
import os, logging
from celery import Celery
from celery.schedules import crontab


# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'RestUserAPI.settings')

app = Celery('UserAPI')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
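
Because of namespace='CELERY' above, any Celery options placed in settings.py have to use the CELERY_ prefix. A minimal sketch of such settings (the broker and backend URLs are example values only, not taken from the project):

# settings.py (sketch) -- with namespace='CELERY', Celery only reads keys
# that start with CELERY_; the URLs below are placeholder example values.
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_TASK_SERIALIZER = 'json'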

1 Answer:

Answer 0 (score: 0)

I went through the Celery docs and it turns out I had missed adding the Celery app to the project's __init__.py.

After adding these lines, the delay function started working.

__init__.py

from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ['celery_app']
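
A quick way to confirm the fix from python manage.py shell (a sketch; the UserAPI.tasks import path is an assumption about where tasks.py lives):

# Sanity check (run inside python manage.py shell); the import path
# assumes tasks.py lives in an app package named UserAPI.
from UserAPI.tasks import TestTaskOne

result = TestTaskOne.delay()   # now returns an AsyncResult immediately instead of hanging
print(result.id)               # the worker log should show this id being received and executed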