Celery asynchronous tasks and periodic tasks together

Time: 2018-02-06 10:17:39

Tags: python django asynchronous django-celery celerybeat

Unable to run periodic tasks and asynchronous tasks together. If I comment out the periodic task, the asynchronous tasks execute fine; otherwise the asynchronous tasks get stuck.

Running: celery==4.0.2, Django==2.0, django-celery-beat==1.1.0, django-celery-results==1.0.1

Referred to https://github.com/celery/celery/issues/4184 and picked celery==4.0.2, since it seemed to work.


This seems to be a known issue:

https://github.com/celery/django-celery-beat/issues/27

"I also did some digging; the only way I found to get it back to normal is to delete all periodic tasks and restart celery beat." ~rh0dium
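The workaround quoted above can be scripted rather than done by hand. A minimal sketch, assuming django-celery-beat stores the schedule in its PeriodicTask model (the project name bid is taken from this question):

```shell
# Delete every stored periodic task; requires a configured Django project
# with django-celery-beat installed.
python manage.py shell -c "from django_celery_beat.models import PeriodicTask; PeriodicTask.objects.all().delete()"
# Then restart beat once the table is empty, e.g.:
# celery beat -A bid -l info
```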

celery.py

import django
import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'bid.settings')

# Setup django project
django.setup()

app = Celery('bid')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

settings.py

INSTALLED_APPS = (
         ...
         'django_celery_results',
         'django_celery_beat',
     )

# Celery related settings

CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_BROKER_TRANSPORT_OPTIONS = {'visibility_timeout': 43200, }
CELERY_RESULT_BACKEND = 'django-db'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_CONTENT_ENCODING = 'utf-8'
CELERY_ENABLE_REMOTE_CONTROL = False
CELERY_SEND_EVENTS = False
CELERY_TIMEZONE = 'Asia/Kolkata'
CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'
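
As an aside, the namespace='CELERY' argument in celery.py is why every key above carries the CELERY_ prefix: Celery reads the Django settings, keeps only the prefixed keys, and maps them to its own lowercase option names. A minimal stdlib sketch of that mapping (the dict below is a stand-in for real Django settings, not Celery's actual implementation):

```python
# Stand-in for Django settings; only CELERY_-prefixed keys reach Celery.
django_settings = {
    "CELERY_BROKER_URL": "redis://localhost:6379/0",
    "CELERY_RESULT_BACKEND": "django-db",
    "DEBUG": True,  # ignored: no CELERY_ prefix
}

prefix = "CELERY_"
celery_conf = {
    key[len(prefix):].lower(): value
    for key, value in django_settings.items()
    if key.startswith(prefix)
}
print(celery_conf)
# {'broker_url': 'redis://localhost:6379/0', 'result_backend': 'django-db'}
```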

Periodic task

@periodic_task(run_every=crontab(hour=7, minute=30), name="send-vendor-status-everyday")
def send_vendor_status():
    return timezone.now()

Asynchronous task

@shared_task
def vendor_creation_email(id):
   return "Email Sent"

Asynchronous task caller

vendor_creation_email.apply_async(args=[instance.id, ]) # main thread gets stuck here, if periodic jobs are scheduled.

The worker is run as follows:

celery worker -A bid -l debug -B

Please help.

1 Answer:

Answer 0 (score: 2)

Here are some observations, the result of several trial-and-error attempts and of digging into celery's source code.

  1. @periodic_task is deprecated, and hence it does not work.
  2. From their source code:

    #venv36/lib/python3.6/site-packages/celery/task/base.py
    def periodic_task(*args, **options):
        """Deprecated decorator, please use :setting:`beat_schedule`."""
        return task(**dict({'base': PeriodicTask}, **options))
    
  3. Use UTC as the base timezone, to avoid timezone-related confusion later on. Configure periodic tasks to fire at computed times relative to UTC. For example, for 'Asia/Kolkata' reduce the scheduled time by 5 hours 30 minutes.

  4. Create celery.py as follows:

      import django
      import os 
      
      from celery import Celery
      # set the default Django settings module for the 'celery' program.
      from celery.schedules import crontab
      
      os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
      # Setup django project
      django.setup()
      
      app = Celery('proj')
      
      # Using a string here means the worker doesn't have to serialize
      # the configuration object to child processes.
      # - namespace='CELERY' means all celery-related configuration keys
      #   should have a `CELERY_` prefix.
      app.config_from_object('django.conf:settings', namespace='CELERY')
      
      # Load task modules from all registered Django app configs.
      app.autodiscover_tasks()
      app.conf.beat_schedule = {
          'test_task': {
              'task': 'test_task',
              'schedule': crontab(hour=2,minute=0),
          }
      }
      

      The task can live in tasks.py under any app, as follows:

      @shared_task(name="test_task")
      def test_add():
          print("Testing beat service")
      

      Run the worker and beat as separate processes:

      celery worker -A proj -l info
      celery beat -A proj -l info

      along with a broker, e.g. Redis. This setup should work fine.
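
The 5 hour 30 minute offset for 'Asia/Kolkata' mentioned in observation 3 can be sanity-checked with the standard library (zoneinfo, Python 3.9+): a task meant to run at 7:30 IST must be scheduled at 2:00 UTC, which is exactly the crontab(hour=2, minute=0) entry used above.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# 7:30 in Asia/Kolkata (UTC+5:30) is 2:00 in UTC, so the question's
# crontab(hour=7, minute=30) becomes crontab(hour=2, minute=0) under UTC.
ist = datetime(2018, 2, 6, 7, 30, tzinfo=ZoneInfo("Asia/Kolkata"))
utc = ist.astimezone(ZoneInfo("UTC"))
print(utc.strftime("%H:%M"))  # 02:00
```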