Celery task stuck in PENDING state

Asked: 2017-10-26 21:17:51

Tags: celery django-celery celery-task celerybeat celeryd

I found an example of running Celery with MongoDB as the result backend (original code example). In that example, the author uses CELERYBEAT_SCHEDULE to run a task with some arguments every minute; in my case I just commented that code out, because I only want to execute the task as soon as it is received. From the worker log I don't even see the task being received, and the output of result.status is PENDING. Why is it stuck in PENDING and never completing the task? It is a simple add task, so I can't imagine it takes long.

Another thing is that I am in a virtual environment, so from what I was told, I should run Celery like this: "celery multi start worker --loglevel=info".

I am new to Celery and this is a little confusing to me. Thanks in advance for your help.

The celeryconfig.py file:

# from celery.schedules import crontab

CELERY_RESULT_BACKEND = "mongodb"
CELERY_MONGODB_BACKEND_SETTINGS = {
    "host": "127.0.0.1",
    "port": 27017,
    "database": "jobs", 
    "taskmeta_collection": "stock_taskmeta_collection",
}

# This was part of the original code, but I commented it out in the hope
# that the task would run right away instead of being delayed.
#
# Used to schedule tasks periodically, passing optional arguments, which
# can be very useful. Celery does not seem to support one-off scheduled
# tasks, only periodic ones.
# CELERYBEAT_SCHEDULE = {
#     'every-minute': {
#         'task': 'tasks.add',
#         'schedule': crontab(minute='*/1'),
#         'args': (1,2),
#     },
# }
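For context, as I understand it, re-enabling the CELERYBEAT_SCHEDULE above would also require running a separate beat process to send the periodic task messages (the worker only executes them), something like:

celery -A tasks beat --loglevel=info

Since I only want on-demand execution, I left the schedule commented out.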

The tasks.py file:

from celery import Celery
import time 

# Specify the MongoDB host and database to connect to
BROKER_URL = 'mongodb://localhost:27017/jobs'

celery = Celery('EOD_TASKS', broker=BROKER_URL)

# Load the backend settings used to store job results
celery.config_from_object('celeryconfig')

@celery.task
def add(x, y):
    time.sleep(5)
    return x + y



# Starting the Celery worker
celery multi start worker --loglevel=info
celery multi v4.1.0 (latentcall)
> Starting nodes...
    > worker@lnx-v2: OK

Running the Celery task:

lnx-v2:171> python
Python 3.4.1 (default, Nov 12 2014, 13:34:48) 
[GCC 4.4.6 20120305 (Red Hat 4.4.6-4)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from tasks import add
>>> result = add.delay(1,1)
>>> result
<AsyncResult: 8e6ee263-d8a4-4b17-8d7a-9873b6c98473>
>>> result.status
'PENDING'

Worker log:

lnx-v2:208> tail -f worker.log
[2017-10-26 13:41:15,658: INFO/MainProcess] mingle: all alone
[2017-10-26 13:41:15,683: INFO/MainProcess] worker@lnx-v2 ready.
[2017-10-26 13:45:50,465: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2017-10-26 13:45:50,487: INFO/MainProcess] mingle: searching for neighbors
[2017-10-26 13:45:51,522: INFO/MainProcess] mingle: all alone
[2017-10-26 13:45:51,540: INFO/MainProcess] worker@lnx-v2 ready.
[2017-10-26 13:47:13,169: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2017-10-26 13:47:13,191: INFO/MainProcess] mingle: searching for neighbors
[2017-10-26 13:47:14,228: INFO/MainProcess] mingle: all alone
[2017-10-26 13:47:14,254: INFO/MainProcess] worker@lnx-v2 ready.


# Celery process
lnx-v2:209> ps -ef | grep celery
15096     1  0 13:47 ?        00:00:00 [celeryd: worker@lnx-v2:MainProcess] -active- (worker --loglevel=info --logfile=worker%I.log --pidfile=worker.pid --hostname=worker@lnx-v2)
15157 15096  0 13:47 ?        00:00:00 [celeryd: worker@lnx-v2:ForkPoolWorker-1]

1 Answer:

Answer 0 (score: 0):

Check whether the add method is listed among the registered Celery tasks using the code below:

celery.tasks.keys()
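For example, from a Python shell in the project directory (a quick sanity check; the registered name should be the module-qualified 'tasks.add'):

>>> from tasks import celery
>>> 'tasks.add' in celery.tasks.keys()

If this prints False, the add function was never registered with the app, which would explain why results stay PENDING.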

I think you have to close the decorator with parentheses:
@celery.task()
def add(x, y):
    time.sleep(5)
    return x + y
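If the missing registration was the problem, restarting the worker and calling the task again should complete within a few seconds (a sketch; the 10-second timeout is an arbitrary margin around the task's 5-second sleep):

>>> from tasks import add
>>> result = add.delay(1, 1)
>>> result.get(timeout=10)
2

result.get() blocks until the result lands in the configured MongoDB backend, so a successful return also confirms that the backend settings are being picked up.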