Django Celery multiple shared tasks

Time: 2015-09-14 14:11:50

Tags: python django rabbitmq celery

So I'm trying to use two different shared tasks in my Django project; call them task1 and task2. task1 is called and works fine, but whenever the code tries to run task2, nothing happens: the task logger logs nothing from inside the task2 function, and there is no trace of it in RabbitMQ or Celery. The Celery workers are managed by supervisor. Can someone point out what I'm doing wrong, so that both task1 and task2 get called?
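
One way to narrow this down is to check which tasks the worker has actually registered. A minimal sketch, assuming the proj layout below, run from python manage.py shell:

from proj.celery import app

# Tasks known to this process (proj.tasks.task1 and proj.tasks.task2 should both appear):
print(sorted(name for name in app.tasks if not name.startswith('celery.')))

# Tasks registered on the running worker (requires the worker to be up):
print(app.control.inspect().registered())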

supervisord.conf

[unix_http_server]
file=/tmp/supervisor.sock                       ; path to your socket file

[supervisord]
logfile=/var/log/supervisord/supervisord.log    ; supervisord log file
logfile_maxbytes=50MB                           ; maximum size of logfile before rotation
logfile_backups=10                              ; number of backed up logfiles
loglevel=info                                  ; info, debug, warn, trace
pidfile=/var/run/supervisord.pid                ; pidfile location
minfds=1024                                     ; number of startup file descriptors
minprocs=200                                    ; number of process descriptors
childlogdir=/var/log/supervisord/               ; where child log files will live

[rpcinterface:supervisor]
supervisor.rpcinterface_factory = supervisor.rpcinterface:make_main_rpcinterface

[supervisorctl]
serverurl=unix:///tmp/supervisor.sock         ; use a unix:// URL for a unix socket

[include]
files=celeryd.conf

celeryd.conf

; ==================================
;  celery worker supervisor
; ==================================

[program:celery]
; Set full path to celery program if using virtualenv
command=/path/to/celery worker -A proj --loglevel=INFO

directory=/path/to/proj
numprocs=1
stdout_logfile=/var/log/celery/worker.log
stderr_logfile=/var/log/celery/worker.log
autostart=true
autorestart=true
startsecs=10

; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 600

; When resorting to send SIGKILL to the program to terminate it
; send SIGKILL to its whole process group instead,
; taking care of its children as well.
killasgroup=true

; if rabbitmq is supervised, set its priority higher
; so it starts first
priority=998

proj/celery.py

from __future__ import absolute_import

import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

from django.conf import settings

app = Celery('proj',
             broker='amqp://name:password@localhost:5672/vhost',
             include=['proj.tasks'])

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
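
As a sanity check, debug_task can be queued from a Django shell to confirm the broker connection and the worker are wired up end to end (a sketch, not part of the project code):

from proj.celery import debug_task

debug_task.delay()  # 'Request: ...' should show up in the worker log shortly after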

proj/tasks.py

from __future__ import absolute_import

from celery import shared_task
from celery.utils.log import get_task_logger
from django.conf import settings

celery_logger = get_task_logger('celery-task')

@shared_task
def task1(args):
    # Do stuff
    pass

@shared_task
def task2(args):
    # Do stuff
    pass
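
To separate a registration problem from a call-site problem, each task can be exercised on its own; a sketch assuming the stubs above:

from proj.tasks import task1, task2

task1('test')        # runs synchronously in this process, no broker involved
task2('test')
task1.delay('test')  # goes through RabbitMQ; check the worker log
task2.delay('test')  # if this never shows up, the problem is between here and the worker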

proj/utils/process.py

from ..tasks import task1, task2

def function(args):
    task1.delay(args)
    if bool:  # NB: the built-in bool is always truthy, so this branch always runs
        task2.delay(args)
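
If that condition is meant to be a real flag, a sketch with an explicit parameter and logging would confirm whether the second .delay() is ever reached (some_condition is a hypothetical stand-in):

import logging

from ..tasks import task1, task2

logger = logging.getLogger(__name__)

def function(args, some_condition=True):  # hypothetical explicit flag
    task1.delay(args)
    if some_condition:
        logger.info('queueing task2 with %r', args)
        task2.delay(args)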

0 Answers:

No answers yet