Asynchronous Celery task calls in Django are not being pushed to the RabbitMQ queue

Time: 2012-12-06 06:20:16

Tags: python django rabbitmq celery

I have celery==3.0.12 and djcelery==3.0.11 installed on a Django 1.4.1 system. I am trying to use Celery to process some tasks asynchronously in one of my projects, but it is not working. So, to test, I started a fresh Django project, defined a sample task add, and called it from the shell:

 >>> res = add.delay(3, 5)

I tried res.status, res.get(), and res.ready(), and all of them blocked. I am monitoring the celery queue in the browser with the rabbitmq-management plugin; the queue sits idle and never receives a message.
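
For reference, AsyncResult.get() accepts a timeout, which avoids hanging the shell indefinitely while debugging; a minimal sketch using the standard API:

>>> res = add.delay(3, 5)
>>> res.ready()           # False while no worker has consumed the task
>>> res.get(timeout=5)    # raises celery.exceptions.TimeoutError instead of blocking forever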

Here is the directory tree:

|-- manage.py
|-- new_app
|   |-- __init__.py
|   |-- __init__.pyc
|   |-- models.py
|   |-- models.pyc
|   |-- tasks.py
|   |-- tasks.pyc
|   |-- tests.py
|   `-- views.py
`-- testapp
    |-- __init__.py
    |-- __init__.pyc
    |-- settings.py
    |-- settings.pyc
    |-- urls.py
    `-- wsgi.py

Here are the contents of the files.

new_app/tasks.py

from celery import task

@task
def add(x, y):
    return x + y

testapp/settings.py

import djcelery
djcelery.setup_loader()

BROKER_URL = 'amqp://'
CELERY_RESULT_BACKEND = 'amqp://'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'

BROKER_HOST = 'localhost'
BROKER_PORT = 5672
BROKER_USER = 'guest'
BROKER_PASSWORD = 'guest'
BROKER_VHOST = '/'
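
As far as I know, BROKER_URL takes precedence over the individual BROKER_HOST/BROKER_PORT/... settings, so the old-style block above is redundant; a single-line equivalent (a sketch assuming the default guest account and the '/' vhost) would be:

BROKER_URL = 'amqp://guest:guest@localhost:5672//'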

When I run python manage.py celeryd -l INFO, it creates the celery queue. Here is the console output: http://dpaste.com/841960/

The RabbitMQ version is 3.0.0.

Output of rabbitmqctl list_queues:

Listing queues ...
celery  0
h4ckb0x.celery.pidbox   0
...done.

1 Answer:

Answer 0 (score: 0)

Try comparing your setup against the following and find what is missing in your app:

Configure RabbitMQ:

First install it (Ubuntu in our case):

sudo apt-get update
sudo apt-get install rabbitmq-server
sudo mkdir /etc/rabbitmq/rabbitmq.conf.d

Set up a user/password/vhost:

sudo rabbitmqctl delete_user guest
sudo rabbitmqctl add_user <username> <password>
sudo rabbitmqctl set_user_tags <username> administrator
sudo rabbitmqctl add_vhost <some_name_for_vhost>
sudo rabbitmqctl set_permissions -p <some_name_for_vhost> <username> ".*" ".*" ".*"
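
To verify that the user, vhost, and permissions took effect, rabbitmqctl can list them back (a quick sanity check with the same CLI):

sudo rabbitmqctl list_users
sudo rabbitmqctl list_vhosts
sudo rabbitmqctl list_permissions -p <some_name_for_vhost>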

Define the settings:

settings.py

RABBIT_USERNAME = <username>
RABBIT_PASSWORD = <password>
RABBIT_HOST = 'localhost'  # or some server dns/ip
RABBIT_VHOST = <some_name_for_vhost>
BROKER_URL = 'amqp://%s:%s@%s:5672/%s' % (RABBIT_USERNAME, RABBIT_PASSWORD, RABBIT_HOST, RABBIT_VHOST)

CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'UTC'
CELERY_ENABLE_UTC = True
CELERY_CREATE_MISSING_QUEUES = True

Define the Celery app:

celery.py

from __future__ import absolute_import

import os

from celery import Celery

from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'yourapp.settings')

app = Celery('yourapp')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
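
The Django integration guide that this celery.py pattern follows also imports the app in the project package's __init__.py, so it is loaded whenever Django starts; a sketch assuming the yourapp package above:

yourapp/__init__.py

from __future__ import absolute_import

from .celery import app as celery_app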

Define the task:

tasks.py

from celery import task

@task(queue="my_queue")
def do_something_in_background(x1, x2):
    # start doing something in the task
    # enter your code here
    pass
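
Since the decorator binds the task to my_queue, plain delay() already routes there; apply_async() can also set the queue explicitly per call. A usage sketch with placeholder arguments:

# delay() is shorthand for apply_async() with default routing
do_something_in_background.delay(3, 5)
do_something_in_background.apply_async((3, 5), queue="my_queue")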

Run the workers:

(This gracefully kills any existing workers, then starts a worker named app_worker with a concurrency of 10 processes, consuming from your task queue my_queue.)

run_workers.sh

#!/bin/bash
ps auxww | grep 'yourapp worker' | grep -v grep | awk '{print $2}' | xargs kill
celery -A yourapp worker -Q my_queue -n app_worker -l info -c 10 -Ofair
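
To confirm the worker actually attached to my_queue, Celery can report each worker's active queues (assuming the same -A yourapp app name):

celery -A yourapp inspect active_queues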

Define a test:

test.py

import tasks

def my_func(x1, x2):
    tasks.do_something_in_background.delay(x1, x2)
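
If everything is wired up correctly, calling my_func(3, 5) should make a message appear on my_queue (and then drain as the worker consumes it); a quick broker-side check, reusing the vhost created above:

sudo rabbitmqctl list_queues -p <some_name_for_vhost>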