I have a use case where I need to start Celery workers so that each one consumes from a unique queue. I tried to implement it as below.
from celery import Celery

app = Celery(broker='redis://localhost:9555/0')

@app.task
def afunction(arg1=None, arg2=None, arg3=None):
    if arg1 == 'awesome_1':
        return "First type of Queue executed"
    if arg2 == "awesome_2":
        return "Second Type of Queue executed"
    if arg3 == "awesome_3":
        return "Third Type of Queue executed"

if __name__ == '__main__':
    qlist = ["awesome_1", "awesome_2", "awesome_3"]
    arglist = [None, None, None]
    for q in qlist:
        arglist[qlist.index(q)] = q
        argv = [
            'worker',
            '--detach',
            '--queue={0}'.format(q),
            '--concurrency=1',
            '-E',
            '--loglevel=INFO'
        ]
        app.worker_main(argv)
        afunction.apply_async(args=[arglist[0], arglist[1], arglist[2]], queue=q)
When executed, this code produces the following output:
[2018-02-08 11:28:43,479: INFO/MainProcess] Connected to redis://localhost:9555/0
[2018-02-08 11:28:43,486: INFO/MainProcess] mingle: searching for neighbors
[2018-02-08 11:28:44,503: INFO/MainProcess] mingle: all alone
[2018-02-08 11:28:44,527: INFO/MainProcess] celery@SYSTEM ready.
[2018-02-08 11:28:44,612: INFO/MainProcess] Received task: __main__.afunction[f092f721-6523-4055-98fc-158ac316f4cc]
[2018-02-08 11:28:44,618: INFO/ForkPoolWorker-1] Task __main__.afunction[f092f721-6523-4055-98fc-158ac316f4cc] succeeded in 0.0010992150055244565s: 'First type of Queue executed'
So I can see that on the first iteration of the for loop the worker executes the task as it should, but then it stops there and never continues through the rest of the for loop. I believe this is happening because the worker is not running detached, but rather as a child process of the script: on ps aux I can see 1 + as many Python processes running this same script as --concurrency is set to. Any pointers on what is going wrong, or on how to make the workers run detached while the for loop keeps iterating, would be appreciated.
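The stall described above is consistent with `app.worker_main()` being a blocking call: the first iteration starts a worker in the foreground and never returns, so the loop never advances. A minimal sketch of one way around this, moving each blocking call into its own child process (`fake_worker_main` is a hypothetical stand-in for `worker_main` so the sketch runs without a broker):

```python
import multiprocessing
import time

def fake_worker_main(queue):
    # Stand-in for app.worker_main(argv), which blocks until the worker
    # shuts down; here it just sleeps briefly so the sketch terminates.
    time.sleep(0.2)

def start_detached(queues):
    # Running each blocking call in its own child process lets the
    # parent loop keep iterating instead of stalling on the first call.
    procs = []
    for q in queues:
        p = multiprocessing.Process(target=fake_worker_main, args=(q,))
        p.start()
        procs.append(p)
    return procs
```

With real workers the same pattern applies, although a worker started this way is still tied to the parent's lifetime rather than truly detached.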
Answer 0 (score: 0)
I tried the following workaround and, although I am not sure of its impact on the underlying infrastructure, it produces the expected result. It would be great if someone could comment on whether there is a better way to solve the problem, but for now I am going with this solution.
from celery import Celery
import os
import time

app = Celery('app', broker='redis://localhost:9555/0')

@app.task
def afunction(arg1=None, arg2=None, arg3=None):
    if arg1 == 'awesome_1':
        return "First type of Queue executed"
    if arg2 == "awesome_2":
        return "Second Type of Queue executed"
    if arg3 == "awesome_3":
        return "Third Type of Queue executed"

qlist = ["awesome_1", "awesome_2", "awesome_3"]
arglist = [None, None, None]

for q in qlist:
    os.system('nohup celery worker -A app.celery -Q {0} --loglevel=INFO --concurrency=1 &'.format(q))
    os.system('echo \'\\n\'')

time.sleep(5)

for q in qlist:
    arglist = [None, None, None]
    arglist[qlist.index(q)] = q
    afunction.apply_async(args=[arglist[0], arglist[1], arglist[2]], queue=q)
A nohup.out file was created with the following output:
[2018-02-08 17:15:53,269: INFO/MainProcess] Connected to redis://localhost:9555/0
[2018-02-08 17:15:53,272: INFO/MainProcess] Connected to redis://localhost:9555/0
[2018-02-08 17:15:53,274: INFO/MainProcess] Connected to redis://localhost:9555/0
[2018-02-08 17:15:53,277: INFO/MainProcess] mingle: searching for neighbors
[2018-02-08 17:15:53,280: INFO/MainProcess] mingle: searching for neighbors
[2018-02-08 17:15:53,280: INFO/MainProcess] mingle: searching for neighbors
[2018-02-08 17:15:54,293: INFO/MainProcess] mingle: all alone
[2018-02-08 17:15:54,295: INFO/MainProcess] mingle: all alone
[2018-02-08 17:15:54,296: INFO/MainProcess] mingle: all alone
[2018-02-08 17:15:54,304: INFO/MainProcess] celery@SYSTEM ready.
[2018-02-08 17:15:54,304: INFO/MainProcess] celery@SYSTEM ready.
[2018-02-08 17:15:54,306: INFO/MainProcess] celery@SYSTEM ready.
[2018-02-08 17:15:57,975: INFO/MainProcess] Received task: app.afunction[e825444d-e123-4f55-9365-f36f95d62734]
[2018-02-08 17:15:57,976: INFO/ForkPoolWorker-1] Task app.afunction[e825444d-e123-4f55-9365-f36f95d62734] succeeded in 0.0003634110325947404s: 'First type of Queue executed'
[2018-02-08 17:15:57,976: INFO/MainProcess] Received task: app.afunction[80816d50-5680-4373-8b5e-dac2ae2a3ff9]
[2018-02-08 17:15:57,977: INFO/MainProcess] Received task: app.afunction[0e88c758-3010-4d37-bda2-6a9a9a02bedf]
[2018-02-08 17:15:57,977: INFO/ForkPoolWorker-1] Task app.afunction[80816d50-5680-4373-8b5e-dac2ae2a3ff9] succeeded in 0.0003187900292687118s: 'Second Type of Queue executed'
[2018-02-08 17:15:57,978: INFO/ForkPoolWorker-1] Task app.afunction[0e88c758-3010-4d37-bda2-6a9a9a02bedf] succeeded in 0.00042019598186016083s: 'Third type of queue executed'
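A follow-up note on the workaround: `os.system('nohup … &')` works, but it goes through a shell and leaves the script with no handle on the spawned workers. A sketch of an alternative using `subprocess.Popen`, which returns immediately and keeps a handle for a later `terminate()` (the `build_worker_cmd` helper and the `-A app` module name are my assumptions for illustration, not Celery API):

```python
import subprocess

def build_worker_cmd(queue):
    # Assemble the same CLI invocation as above, one worker per queue.
    return [
        "celery", "worker",
        "-A", "app",        # assumed module containing the Celery app
        "-Q", queue,        # bind this worker to a single queue
        "--concurrency=1",
        "--loglevel=INFO",
    ]

def start_workers(queues):
    # Popen returns immediately, so the loop is never blocked the way
    # app.worker_main() blocks; keep the handles to terminate() later.
    return [subprocess.Popen(build_worker_cmd(q)) for q in queues]
```

This avoids the fixed `time.sleep(5)` guesswork somewhat less than a readiness check would, but it does let the script shut the workers down cleanly by iterating over the returned handles.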