Celery task not working in Django framework

Date: 2019-05-19 06:44:32

Tags: django redis celery

I am trying to use Celery with a Redis broker in Django to send an email to the user 5 times as an asynchronous task. My Celery server is running and responding to the celery CLI, and it even receives the task from Django, but after that I get an error message like the following:
Traceback (most recent call last):
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\billiard\pool.py", line 358, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\celery\app\trace.py", line 544, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)



task.py -
from celery.decorators import task  # deprecated in Celery 4+; celery.shared_task is the modern import
from django.core.mail import EmailMessage
import time

@task(name="Sending_Emails")
def send_email(to_email, message):
    # Send the email 5 times, one second apart.
    time1 = 1
    while time1 <= 5:  # was `time1 != 5`, which only sent 4 emails
        print("Sending Email")
        email = EmailMessage('Checking Asynchronous Task', message + str(time1), to=[to_email])
        email.send()
        time.sleep(1)
        time1 += 1

views.py - 
from .task import send_email  # import assumed; the question only shows the view body

def send_email_view(request):  # surrounding view function is assumed
    print("sending for Queue")
    send_email.delay(request.user.email, "Email sent : ")
    print("sent for Queue")

settings.py - 
# CELERY STUFF
BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Kolkata'  # 'Asia/India' is not a valid IANA timezone name
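
Note: these old-style names (BROKER_URL, CELERY_RESULT_BACKEND, ...) pair with the unnamespaced config_from_object('django.conf:settings') call in celery.py below. Celery 4's documented Django pattern instead prefixes every setting with CELERY_ and passes a namespace; a sketch of the equivalent, not what the question uses:

# settings.py (namespaced style)
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'

# celery.py
app.config_from_object('django.conf:settings', namespace='CELERY')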


celery.py - 
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'ECartApplication.settings')
app = Celery('ECartApplication')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
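
Not shown in the question is the project's __init__.py, which the standard Celery/Django setup uses to load the app when Django starts, so that tasks are registered. A minimal sketch, assuming the ECartApplication package layout implied by celery.py above:

ECartApplication/__init__.py -
from __future__ import absolute_import
# Load the Celery app whenever Django starts, so @task decorators register.
from .celery import app as celery_app

__all__ = ('celery_app',)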

I expect the email to be sent 5 times, but instead I get an error. Worker output:

[tasks]
  . ECartApplication.celery.debug_task
  . Sending_Emails

[2019-05-19 12:41:27,695: INFO/SpawnPoolWorker-2] child process 3628 calling self.run()
[2019-05-19 12:41:27,696: INFO/SpawnPoolWorker-1] child process 5748 calling self.run()
[2019-05-19 12:41:28,560: INFO/MainProcess] Connected to redis://localhost:6379//
[2019-05-19 12:41:30,599: INFO/MainProcess] mingle: searching for neighbors
[2019-05-19 12:41:35,035: INFO/MainProcess] mingle: all alone
[2019-05-19 12:41:39,069: WARNING/MainProcess] c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\celery\fixups\django.py:202: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2019-05-19 12:41:39,070: INFO/MainProcess] celery@vipin-PC ready.
[2019-05-19 12:41:46,448: INFO/MainProcess] Received task: Sending_Emails[db10dad4-a8ec-4ad2-98a6-60e8c3183dd1]
[2019-05-19 12:41:47,455: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
Traceback (most recent call last):
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\billiard\pool.py", line 358, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\celery\app\trace.py", line 544, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)

1 Answer:

Answer 0 (score: 0):

This is a known issue when running Celery on Python under Windows 7/10: Celery 4 no longer officially supports Windows, and its default prefork worker pool breaks there, which is what produces this ValueError.

There is a workaround: you just need to use the eventlet module, which can be installed with pip:

pip install eventlet

After that, run your worker with -P eventlet at the end of the command:

celery -A MyWorker worker -l info -P eventlet
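
For the project in the question that would be (assuming the ECartApplication app defined in celery.py above):

celery -A ECartApplication worker -l info -P eventlet

Other workarounds commonly used on Windows are the solo pool (-P solo) and, reportedly, setting the environment variable FORKED_BY_MULTIPROCESSING=1 before starting the worker.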