I'm trying to chain two tasks that both use HttpDispatchTask; each task has its own URL and performs a different operation.
I'm not sure whether the problem is the way I pass the URL parameters to the subtasks, or the way I try to collect the results from the result backend... I just need some guidance.
My setup is as follows.
I define my queue (in the module cel.queues.bueller):
from __future__ import absolute_import

from celery import Celery

celery = Celery('bueller',
                broker="mongodb://localhost:27017/tasks",
                include=['celery.task.http'])

celery.conf.update(
    CELERY_TASK_RESULT_EXPIRES=600,
    CELERYD_CONCURRENCY=20,
    CELERY_RESULT_BACKEND="mongodb",
    CELERY_MONGODB_BACKEND_SETTINGS={
        "host": "localhost",
        "port": 27017,
        "database": "results",
        "taskmeta_collection": "taskmeta_collection",
    },
)
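To rule out the result backend itself, this is the kind of direct check I have in mind (a rough sketch using pymongo; the database and collection names come from the settings above, and the field names are my assumption about how the backend stores task metadata):

from pymongo import MongoClient

# Connect to the same mongodb instance configured as the result backend above.
client = MongoClient('localhost', 27017)
collection = client['results']['taskmeta_collection']

# Assumption: each stored result document carries the task id in '_id'
# plus 'status' and 'result' fields.
for doc in collection.find().limit(5):
    print doc['_id'], doc.get('status'), doc.get('result')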
and I call the two tasks in a chain:
#!/usr/bin/env python
from cel.queues.bueller import celery
from celery.task.http import HttpDispatchTask
from celery import chain
# task_a calls an audio processor
task_a = HttpDispatchTask.subtask(
    None,
    {
        'url': 'http://example.com/operations/audio/update?format=json',
        'method': 'GET'
    },
    options={'queue': 'bueller'}
)

# task_b calls a video processor
task_b = HttpDispatchTask.subtask(
    None,
    {
        'url': 'http://example.com/operations/video/update?format=json',
        'method': 'GET'
    },
    options={'queue': 'bueller'}
)
res = chain(task_a, task_b).apply_async()
print list(res.collect())
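In case it helps narrow things down, this is roughly how I would dispatch one of these tasks on its own, without the chain, using the same kwargs as task_a above (the 30-second timeout is just an arbitrary value for the example):

#!/usr/bin/env python
# Sanity-check sketch: fire a single HTTP dispatch with no chain involved.
from cel.queues.bueller import celery  # loads the broker/backend configuration
from celery.task.http import HttpDispatchTask

result = HttpDispatchTask.apply_async(
    kwargs={
        'url': 'http://example.com/operations/audio/update?format=json',
        'method': 'GET'
    },
    queue='bueller'
)

# Wait for the worker to store the HTTP response in the result backend.
print result.get(timeout=30)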
The traceback I get when I run this is:
Traceback (most recent call last):
  File "./task_runner.py", line 23, in <module>
    print list(res.collect())
  File "/Users/bueller/.virtualenvs/bueller-queue/lib/python2.7/site-packages/celery/result.py", line 142, in collect
    yield R, R.get(**kwargs)
  File "/Users/bueller/.virtualenvs/bueller-queue/lib/python2.7/site-packages/celery/result.py", line 108, in get
    interval=interval)
  File "/Users/bueller/.virtualenvs/bueller-queue/lib/python2.7/site-packages/celery/backends/base.py", line 185, in wait_for
    raise result
TypeError: run() got multiple values for keyword argument 'url'
Thanks, any help is much appreciated.