Celery neither returns nor fails when calling apply_async; works with celery beat

Asked: 2019-11-11 15:12:46

Tags: django rabbitmq celery amqp

I'm having trouble calling a Celery task with apply_async. In my settings.py I have:

from kombu import Exchange, Queue  # needed for CELERY_TASK_QUEUES below

CELERY_BROKER_TRANSPORT_OPTIONS = {'confirm_publish': True}
CELERY_BROKER_URL = env('RABBITMQ_URL')
print(CELERY_BROKER_URL)  # pyamqp://un:pw@app-rabbitmq:5672
CELERY_TASK_QUEUES = (
    Queue('default', Exchange('default', type='direct'), routing_key='default'),
    Queue('email', Exchange('email', type='direct'), routing_key='email'),
)
CELERY_TASK_ROUTES = {
    'core.services.CoreSendEmailTaskService.*': {
        'exchange': 'email',
        'routing_key': 'email'
    },
}
CELERY_ACCEPT_CONTENT = ['json']

CELERY_TASK_DEFAULT_QUEUE = 'default'
CELERY_TASK_ACKS_LATE = True
CELERY_TASK_SERIALIZER = 'json'

CELERY_RESULT_EXPIRES = 600
CELERY_RESULT_SERIALIZER = 'json'
CELERY_RESULT_BACKEND = env('CELERY_RESULT_BACKEND_URL')  # redis://app-redis:6379/3
CELERY_RESULT_PERSISTENT = False

CELERY_WORKER_TASK_TIME_LIMIT = 65
CELERY_WORKER_TASK_SOFT_TIME_LIMIT = 60
CELERY_WORKER_HIJACK_ROOT_LOGGER = False
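Celery matches task names against the keys of `CELERY_TASK_ROUTES` using glob-style patterns, so any task whose dotted name starts with `core.services.CoreSendEmailTaskService.` should be routed to the `email` queue, and everything else should fall back to `default`. A rough stdlib sketch of that matching (the task names below are hypothetical examples; Celery's real resolver lives in `celery.app.routes`):

```python
from fnmatch import fnmatchcase

# Mirrors the CELERY_TASK_ROUTES mapping from settings.py above.
ROUTES = {
    'core.services.CoreSendEmailTaskService.*': {
        'exchange': 'email',
        'routing_key': 'email',
    },
}

def route_for(task_name, routes=ROUTES, default='default'):
    """Return the routing key a task name would resolve to."""
    for pattern, options in routes.items():
        if fnmatchcase(task_name, pattern):
            return options['routing_key']
    return default

print(route_for('core.services.CoreSendEmailTaskService.send_welcome'))  # email
print(route_for('core.services.OtherService.do_work'))                   # default
```

If a task resolves to `default` when you expected `email` (or vice versa), the worker may simply not be consuming the queue the message was published to.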

In project_config/celery_config/tasks/__init__.py:

from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app

__all__ = ('app',)

And in project_config/celery_config/tasks/celery.py:

import os

import celery
import raven
from django.apps import AppConfig
from raven.contrib.celery import register_logger_signal, register_signal


class CeleryApp(celery.Celery):
    def on_configure(self):
        sentry_dns = os.environ.get('DJANGO_SENTRY_DNS', None)

        if sentry_dns and os.environ.get('ENVIRONMENT', 'local') == 'production':
            client = raven.Client(
                sentry_dns
            )

            register_logger_signal(client)
            register_signal(client)


app = CeleryApp('tasks')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()


class CeleryTasksConfig(AppConfig):
    name = 'project_config.celery_config.tasks'
    verbose_name = 'Celery Tasks'

    def ready(self):
        app.config_from_object('django.conf:settings', namespace='CELERY')
        app.autodiscover_tasks()

The strange thing is that the tasks execute on staging and production, but not locally. If a task is scheduled through beat, it also runs fine locally.

Locally the worker is started with the following command: celery -A project_config.celery_config.tasks worker -O fair --loglevel=DEBUG --maxtasksperchild=1000 --queues=default,email -P prefork
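With `confirm_publish` enabled, `apply_async` blocks until the broker acknowledges the publish, so a call that neither returns nor fails is often a sign that the broker host in `CELERY_BROKER_URL` is not reachable from the process making the call (a common situation when the caller runs outside the Docker network where `app-rabbitmq` resolves). A quick stdlib connectivity check, with the hostname and port taken from the settings above (adjust for your environment):

```python
import socket

def broker_reachable(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == '__main__':
    # Host/port from the pyamqp:// URL printed in settings.py above.
    print(broker_reachable('app-rabbitmq', 5672))
```

If this prints False from the environment where `apply_async` is called (but True from the worker's environment), the message is never reaching RabbitMQ, which would explain why beat-scheduled tasks (published from inside the worker environment) still run.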

0 Answers:

No answers yet