Python Celery worker seems to be spawning extra processes

Date: 2015-06-08 13:17:45

Tags: python celery

I'm running a Python Celery project, following this documentation: http://celery.readthedocs.org/

Layout:

celery_folder/celeryServer.py       # server to handle http requests
celery_folder/celeryconfig.py       # config file
celery_folder/start-celery.sh       # shell script to start the workers

celery_folder/myCelery/celery.py    # the celery app
celery_folder/myCelery/tasks.py     # function definitions

I can start the Celery workers with the usual command:

celery worker --app=<APP> --concurrency=1 --loglevel=info --queues=remote

When I do this, I see 2 new processes spawned in htop:

-bash
  -->/usr/bin/python /usr/local/bin/celery worker --app=myCelery --concurrency=1 --loglevel=info --queues=remote
      -->/usr/bin/python /usr/local/bin/celery worker --app=myCelery --concurrency=1 --loglevel=info --queues=remote

This happens regardless of the concurrency setting.

When I add the config file (--config=celeryconfig.py), 4 processes are spawned, like this:

-bash
  -->/usr/bin/python /usr/local/bin/celery worker --app=myCelery --concurrency=1 --loglevel=info --queues=remote
      -->/usr/bin/python /usr/local/bin/celery worker --app=myCelery --concurrency=1 --loglevel=info --queues=remote
          -->/usr/bin/python /usr/local/bin/celery worker --app=myCelery --concurrency=1 --loglevel=info --queues=remote
      -->/usr/bin/python /usr/local/bin/celery worker --app=myCelery --concurrency=1 --loglevel=info --queues=remote

Why am I getting these extra processes? What are they doing?

Here is the config file:

CELERY_TASK_RESULT_EXPIRES  = 3600
CELERY_RESULT_BACKEND       = 'amqp'
CELERY_CONCURRENCY          = 1
CELERY_ROUTES               = {'myCelery.tasks.home_task':{'queue':'remote'}}
CELERYD_AUTOSCALER          = 'celery.worker.autoscale:Autoscaler'
CELERY_IMPORTS              = ('celery.task.http',)   # setting is plural; single-element tuple needs the trailing comma
CELERY_INCLUDE              = ('myCelery.tasks',)

Edit: here are celery.py and tasks.py

projectFolder/myCelery/celery.py

from __future__ import absolute_import
from celery import Celery

app = Celery()
app.config_from_object('celeryconfig')

if __name__ == '__main__':
    app.start()

projectFolder/myCelery/tasks.py

from __future__ import absolute_import
from myCelery.celery import app

import os
import json
import requests
import datetime as dt
import dateutil.parser

from myCelery.<GITHUB REPO> import <FUNCTION FROM REPO>

def login():
    <CUSTOM FUNCTION REDACTED>

def write_token_to_file(data, headers):
    <CUSTOM FUNCTION REDACTED>

def get_headers():
    <REDACTED>

@app.task
def add(imei):
    <REDACTED>       

@app.task
def remote_task(imei):
    <REDACTED>
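For context on what `@app.task` is doing above: it registers the decorated function in the app's task registry under a dotted name, so workers can look tasks up by name later. A minimal pure-Python sketch of that registration pattern (a hypothetical `MiniApp`, not Celery's actual implementation) looks like:

```python
# Hypothetical sketch of the registry pattern behind Celery's @app.task;
# this is NOT Celery code, just an illustration of the decorator idea.
class MiniApp:
    def __init__(self):
        self.tasks = {}  # maps dotted task name -> callable

    def task(self, func):
        # Register under "module.function", similar to Celery's task naming.
        name = f"{func.__module__}.{func.__name__}"
        self.tasks[name] = func
        return func  # the function itself is returned unchanged

app = MiniApp()

@app.task
def add(x, y):
    return x + y

print(app.tasks)  # one entry, keyed by the dotted name of `add`
```

Calling `add(2, 3)` still works directly; the registry is what a worker process would consult to dispatch incoming task messages by name.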

1 Answer:

Answer 0: (score: 0)

That's because Celery uses a "main" process to talk to the pool's processes.

 celery -A python worker -c 1 -l debug

...

vagrant@vagrant:~$ ps aux | grep celery
vagrant  18518 18.2  2.2 111748 44624 pts/2    S+   04:05   0:00 [celeryd: celery@vagrant:MainProcess] -active- (-A python worker -c 1 -l debug)
vagrant  18522  0.0  1.9 104252 39020 pts/2    S+   04:05   0:00 [celeryd:celery@vagrant:PoolWorker-1]

Two workers:

celery -A python worker -c 2 -l debug

...

vagrant@vagrant:~$ ps aux | grep celery
vagrant  18525 19.2  2.2 111840 44732 pts/2    S+   04:07   0:00 [celeryd: celery@vagrant:MainProcess] -active- (-A python worker -c 2 -l debug)
vagrant  18530  0.0  1.9 104252 39020 pts/2    S+   04:07   0:00 [celeryd: celery@vagrant:PoolWorker-1]
vagrant  18531  0.0  1.9 104252 39080 pts/2    S+   04:07   0:00 [celeryd: celery@vagrant:PoolWorker-2]
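The N+1 pattern above (one main process plus one child per concurrency slot) can be reproduced with the standard library's `multiprocessing` pool. This is only a rough analogy to Celery's prefork pool, assuming the default fork start method on Linux:

```python
import multiprocessing as mp

def square(x):
    return x * x

# The current interpreter plays the role of the worker's "main" process;
# Pool(processes=1) forks one child, analogous to
# `celery worker --concurrency=1` showing up as 2 processes in htop.
pool = mp.Pool(processes=1)
results = pool.map(square, [1, 2, 3])  # main process dispatches, child computes
pool.close()
pool.join()
print(results)  # [1, 4, 9]
```

Bumping `processes=2` adds a second child, matching the three-process tree you would see for `--concurrency=2`: the parent never runs tasks itself, it only feeds the children and collects results.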