Task autodiscovery not working in Celery 4

Asked: 2018-11-14 13:36:45

Tags: python django celery

I have been using Celery in this project for a while: there is a tasks.py file in my app directory, and some tasks are defined and scheduled in celery.py. But when I run `celery -A project worker -l info`, the tasks cannot be found — the worker does not pick up anything from app/tasks.py. Here is my tasks.py file:

import requests
import arrow
from celery import shared_task

# Assumed import locations: Cloud and Cloudx come from app.models (as in
# celery.py below); InterfaceLogs, ACCESS_TOKEN, BASE_URL_PROD and
# ACTION_URLS are presumably defined elsewhere in the project.
from app.models import Cloud, Cloudx


@shared_task
def fetch_interface_logs(*products):
    """
    Fetch the interface logs for each product and attach them
    to the matching Cloud/Cloudx object.
    """
    for product in products:
        headers = {'X-Auth-Token': ACCESS_TOKEN, 'Content-Type': 'application/json'}
        log_URL = BASE_URL_PROD + ACTION_URLS.get('logs').format(product[0])
        interface_resp = requests.get(log_URL, headers=headers).json()

        if Cloud.objects.filter(product_uid=product[0]):
            base_obj = Cloud.objects.get(product_uid=product[0])
        elif Cloudx.objects.filter(product_uid=product[0]):
            base_obj = Cloudx.objects.get(product_uid=product[0])

        interface_json = interface_resp['data']

        for log in interface_json:
            if log['event']:
                interface = InterfaceLogs()
                interface.event_name = log['event']
                interface.message = log['message']
                interface.resource_type = log['resource_type']
                # log['time'] is in milliseconds; convert to a datetime
                interface.log_time = arrow.get(int(log['time']) / 1000).datetime
                interface.save()

                base_obj.interface_logs.add(interface)
                base_obj.save()

Here is the code in celery.py:

from __future__ import absolute_import
import os
from celery import Celery
from celery import shared_task
from django.conf import settings
from celery.schedules import crontab
import django

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'cloud_sdn.settings')
django.setup()
app = Celery('cloud_sdn')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object(settings, namespace='CELERY')

# Load all the tasks automatically
app.autodiscover_tasks()

from app.models import Cloudx, Cloud


app.conf.beat_schedule = {

    'add-cloudx_interface-logs-everyday-afternoon': {
        'task': 'fetch_interface_logs',
        'schedule': crontab(hour=17, minute=2),
        'args': list(Cloudx.objects.all().values_list('product_uid')),

    },
    'add-cloud_interface-logs-everyday-afternoon': {
        'task': 'fetch_interface_logs',
        'schedule': crontab(hour=17, minute=12),
        'args': list(Cloud.objects.all().values_list('product_uid')),

    }
}
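One thing worth noting while debugging: unless a task is given an explicit `name=`, Celery registers a `@shared_task` under its dotted module path, so a task defined in app/tasks.py is normally registered as `app.tasks.fetch_interface_logs` rather than plain `fetch_interface_logs`. A minimal sketch of that naming convention (plain Python, no Celery required; the module and function names are just the ones from this question):

```python
def default_task_name(module, func):
    # Celery's default task name is "<module>.<function>"
    return f"{module}.{func}"

print(default_task_name('app.tasks', 'fetch_interface_logs'))
# app.tasks.fetch_interface_logs
```

If the worker log lists the task under its full dotted name, the `'task'` keys in `beat_schedule` need to use that same name.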

Here is the code in __init__.py:
from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ['celery_app']

I am using Redis as the broker.
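For reference, a minimal broker configuration in settings.py might look like the following (a sketch with hypothetical values — adjust host, port and db numbers to your setup; the `CELERY_` prefix matches the `namespace='CELERY'` used in celery.py above):

```python
# settings.py -- hypothetical values, not taken from the question
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'
```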

1 Answer:

Answer 0 (score: 0):

Are you starting the worker process from the correct directory? Your project directory is cloud_sdn:

celery -A <directory> worker -l info