I'm working through the Django/Celery Quickstart... or, how I learned to stop using cron and love celery tutorial, and it seems the jobs are getting queued but never run.
tasks.py:
from celery.task.schedules import crontab
from celery.decorators import periodic_task
# this will run every minute, see http://celeryproject.org/docs/reference/celery.task.schedules.html#celery.task.schedules.crontab
@periodic_task(run_every=crontab(hour="*", minute="*", day_of_week="*"))
def test():
    print "firing test task"
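(As an aside beyond the question itself, which uses Celery 3.0: `celery.decorators.periodic_task` was removed in Celery 4, so on current versions the same schedule would be expressed through the beat schedule instead. A rough sketch, with the app/module name `proj` being hypothetical:)

```python
# Sketch for Celery 4+/5 only; the question's code targets Celery 3.0,
# where @periodic_task still exists. The name "proj" is hypothetical.
from celery import Celery
from celery.schedules import crontab  # moved from celery.task.schedules

app = Celery("proj")

@app.task
def test():
    print("firing test task")

# Periodic registration moves from the decorator into the beat schedule.
app.conf.beat_schedule = {
    "fire-test-every-minute": {
        "task": "proj.test",
        "schedule": crontab(minute="*"),
    },
}
```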
So I run celery:
bash-3.2$ sudo manage.py celeryd -v 2 -B -s celery -E -l INFO
/scratch/software/python/lib/celery/apps/worker.py:166: RuntimeWarning: Running celeryd with superuser privileges is discouraged!
'Running celeryd with superuser privileges is discouraged!'))
-------------- celery@myserver v3.0.12 (Chiastic Slide)
---- **** -----
--- * *** * -- [Configuration]
-- * - **** --- . broker: django://localhost//
- ** ---------- . app: default:0x12120290 (djcelery.loaders.DjangoLoader)
- ** ---------- . concurrency: 2 (processes)
- ** ---------- . events: ON
- ** ----------
- *** --- * --- [Queues]
-- ******* ---- . celery: exchange:celery(direct) binding:celery
--- ***** -----
[Tasks]
. GotPatch.tasks.test
[2012-12-12 11:58:37,118: INFO/Beat] Celerybeat: Starting...
[2012-12-12 11:58:37,163: INFO/Beat] Scheduler: Sending due task GotPatch.tasks.test (GotPatch.tasks.test)
[2012-12-12 11:58:37,249: WARNING/MainProcess] /scratch/software/python/lib/djcelery/loaders.py:132: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
warnings.warn("Using settings.DEBUG leads to a memory leak, never "
[2012-12-12 11:58:37,348: WARNING/MainProcess] celery@myserver ready.
[2012-12-12 11:58:37,352: INFO/MainProcess] consumer: Connected to django://localhost//.
[2012-12-12 11:58:37,700: INFO/MainProcess] child process calling self.run()
[2012-12-12 11:58:37,857: INFO/MainProcess] child process calling self.run()
[2012-12-12 11:59:00,229: INFO/Beat] Scheduler: Sending due task GotPatch.tasks.test (GotPatch.tasks.test)
[2012-12-12 12:00:00,017: INFO/Beat] Scheduler: Sending due task GotPatch.tasks.test (GotPatch.tasks.test)
[2012-12-12 12:01:00,020: INFO/Beat] Scheduler: Sending due task GotPatch.tasks.test (GotPatch.tasks.test)
[2012-12-12 12:02:00,024: INFO/Beat] Scheduler: Sending due task GotPatch.tasks.test (GotPatch.tasks.test)
The tasks are indeed getting queued:
python manage.py shell
>>> from kombu.transport.django.models import Message
>>> Message.objects.count()
234
The count increases over time:
>>> Message.objects.count()
477
Nothing in the log file indicates that the tasks are actually being executed. I was expecting something like:
[... INFO/MainProcess] Task myapp.tasks.test[39d57f82-fdd2-406a-ad5f-50b0e30a6492] succeeded in 0.00423407554626s: None
Any suggestions on how to diagnose / debug this?
Answer 0 (score 0):
I'm new to celery as well, but judging by the comments on the link you provided, it looks like there's an error in the tutorial. One of the comments points out:
In this command
sudo ./manage.py celeryd -v 2 -B -s celery -E -l INFO
you have to add "-I tasks" to be able to load the tasks.py file...
Have you tried that?
Answer 1 (score 0):
You should check whether the BROKER_URL parameter is specified in Django's settings.py:
BROKER_URL = 'django://'
You should also check that the time zones in Django, MySQL, and Celery are the same; that's what helped in my case.
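Taken together, the relevant settings.py fragment for this Celery 3.x / django-transport setup would look roughly like the sketch below (an assumption based on the django-celery docs of that era; `djcelery` and `kombu.transport.django` must be installed, and the time zone value is only an example):

```python
# settings.py (sketch for django-celery / Celery 3.x, as in the question)
import djcelery
djcelery.setup_loader()

BROKER_URL = 'django://'          # database-backed broker

INSTALLED_APPS = (
    # ...
    'djcelery',                   # django-celery integration
    'kombu.transport.django',     # creates the broker tables (the Message model)
)

# Keep Django and Celery clocks in agreement so beat fires when expected.
TIME_ZONE = 'UTC'
CELERY_TIMEZONE = 'UTC'
CELERY_ENABLE_UTC = True
```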
P.S.:
[... INFO/MainProcess] Task myapp.tasks.test[39d57f82-fdd2-406a-ad5f-50b0e30a6492] succeeded in 0.00423407554626s: None
This line means your task was scheduled (not executed!).
Check your configuration; I hope it helps you.
Answer 2 (score 0):
I hope someone can learn from my hours of hacking on this.
After setting everything up according to the tutorial, I noticed that when I called
add.delay(4,5)
nothing happened. The worker did not receive the task (nothing was printed to stderr).
The problem was with the RabbitMQ installation. It turns out the default free-disk-space requirement is 1 GB, which was far too much for my VM.
What put me on the right track was reading the RabbitMQ log file. To find its location, I had to stop and start the RabbitMQ server: sudo rabbitmqctl stop
sudo rabbitmq-server
RabbitMQ dumps the log file location to the screen. In that file I noticed this:
=WARNING REPORT==== 14-Mar-2017::13:57:41 ===
disk resource limit alarm set on node rabbit@supporttip.
**********************************************************
*** Publishers will be blocked until this alarm clears ***
**********************************************************
I then followed the instructions here to reduce the free disk limit: Rabbitmq ignores configuration on Ubuntu 12
As a baseline, I used the example configuration file from the RabbitMQ git repository: https://github.com/rabbitmq/rabbitmq-server/blob/stable/docs/rabbitmq.config.example
The change itself:
{disk_free_limit, "50MB"}
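As a quicker, non-persistent check (an aside beyond the original answer), the limit can also be changed at runtime with `rabbitmqctl`; this resets when the broker restarts, so the config-file change above is still needed:

```shell
# Lower the free-disk threshold at runtime (lost on broker restart).
sudo rabbitmqctl set_disk_free_limit 50MB

# Confirm the alarm has cleared: inspect the alarms / disk_free fields.
sudo rabbitmqctl status
```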