Celery task blocks in Django view when using AWS SQS as broker

Time: 2018-02-07 17:14:53

Tags: django celery amazon-sqs django-celery

I'm trying to run a Celery task from a Django view using my_task.delay(). However, the task is never executed: the code blocks on that line and the view never renders. I'm using AWS SQS as the broker, and the IAM user has full access to SQS.

What am I doing wrong?

Running Celery and Django

I run Celery like this:

celery -A app worker -l info

I start my Django development server locally in another terminal:

python manage.py runserver

Output of the celery command:

-------------- celery@LAPTOP-02019EM6 v4.1.0 (latentcall)
---- **** -----
--- * ***  * -- Windows-10-10.0.16299 2018-02-07 13:48:18
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         app:0x6372c18
- ** ---------- .> transport:   sqs://**redacted**:**@localhost//
- ** ---------- .> results:     disabled://
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF
--- ***** -----
-------------- [queues]
                .> my-queue      exchange=my-queue(direct) key=my-queue


[tasks]
. app.celery.debug_task
. counter.tasks.my_task

[2018-02-07 13:48:19,262: INFO/MainProcess] Starting new HTTPS connection (1): sa-east-1.queue.amazonaws.com
[2018-02-07 13:48:19,868: INFO/SpawnPoolWorker-1] child process 20196 calling self.run()
[2018-02-07 13:48:19,918: INFO/SpawnPoolWorker-4] child process 19984 calling self.run()
[2018-02-07 13:48:19,947: INFO/SpawnPoolWorker-3] child process 16024 calling self.run()
[2018-02-07 13:48:20,004: INFO/SpawnPoolWorker-2] child process 19572 calling self.run()
[2018-02-07 13:48:20,815: INFO/MainProcess] Connected to sqs://**redacted**:**@localhost//
[2018-02-07 13:48:20,930: INFO/MainProcess] Starting new HTTPS connection (1): sa-east-1.queue.amazonaws.com
[2018-02-07 13:48:21,307: WARNING/MainProcess] c:\users\nicolas\anaconda3\envs\djangocelery\lib\site-packages\celery\fixups\django.py:202: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2018-02-07 13:48:21,311: INFO/MainProcess] celery@LAPTOP-02019EM6 ready.

views.py

from django.http import HttpResponse

from .tasks import my_task

def index(request):
    print('New request') # This is called
    my_task.delay()
    # Never reaches here
    return HttpResponse('test')

tasks.py

...
@shared_task
def my_task():
    print('Task ran successfully') # never prints anything

settings.py

My configuration is as follows:

import djcelery
djcelery.setup_loader()
CELERY_BROKER_URL = 'sqs://'
CELERY_BROKER_TRANSPORT_OPTIONS = {
    'region': 'sa-east-1',
}
CELERY_BROKER_USER = '****************'
CELERY_BROKER_PASSWORD = '***************************'
CELERY_TASK_DEFAULT_QUEUE = 'my-queue'

Versions:

I'm using the following versions of Django and Celery:

Django==2.0.2
django-celery==3.2.2
celery==4.1.0

Thanks for your help!

1 Answer:

Answer 0 (score: 0)

A bit late, but maybe you're still interested. I have Celery running with Django and SQS and can't see any error in your code. Maybe you're missing something in your celery.py file? Here is my code for comparison.

import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'djangoappname.settings')

# do not use namespace because default amqp broker would be called
app = Celery('lsaweb')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks()
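
For completeness, the standard Celery/Django layout also imports this app in the project package's __init__.py so that @shared_task decorators bind to it when Django starts. A minimal sketch, assuming your project package matches the settings module name used above:

# djangoappname/__init__.py  (package name assumed from DJANGO_SETTINGS_MODULE)
# Load the Celery app when Django starts so @shared_task uses this app.
from .celery import app as celery_app

__all__ = ('celery_app',)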

Have you also checked whether SQS is actually receiving the messages (try polling the queue in the SQS management console)?
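
If you prefer to check programmatically instead of through the console, here is a minimal sketch using boto3, assuming boto3 is installed, AWS credentials are available in the environment, and the queue name 'my-queue' from the worker output above:

# check_queue.py -- print approximate message counts for the SQS queue
import boto3

sqs = boto3.client('sqs', region_name='sa-east-1')
queue_url = sqs.get_queue_url(QueueName='my-queue')['QueueUrl']
attrs = sqs.get_queue_attributes(
    QueueUrl=queue_url,
    AttributeNames=[
        'ApproximateNumberOfMessages',          # messages waiting to be consumed
        'ApproximateNumberOfMessagesNotVisible' # messages currently being processed
    ],
)
print(attrs['Attributes'])

If the counts stay at zero after calling my_task.delay(), the message never left the producer side; if they grow but are never consumed, the worker is not reading from that queue.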