My problem is the same as in "Celery not queuing tasks to broker on remote server, adds tasks to localhost instead", but the answer there doesn't work for me.
My celery.py:
import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery('project', broker='amqp://<user>:<user_pass>@remoteserver:5672/<vhost>', backend='amqp')
# app = Celery('project')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
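For reference, because of namespace='CELERY' the config_from_object() call also reads CELERY_-prefixed keys from Django settings. A minimal sketch of what that settings block could look like (the values below are placeholders, not my real config):

# project/settings.py (relevant excerpt -- hypothetical values)

# With namespace='CELERY', config_from_object() maps these keys onto
# Celery options: CELERY_BROKER_URL -> broker_url,
# CELERY_RESULT_BACKEND -> result_backend.
CELERY_BROKER_URL = 'amqp://<user>:<user_pass>@remoteserver:5672/<vhost>'
CELERY_RESULT_BACKEND = 'rpc://'

# If a broker URL is also defined here (or left at a localhost default),
# the worker and the Django process can end up reading different broker
# settings, which is worth ruling out when tasks land on the wrong host.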
When I run:
$ celery worker -l info
I get the following output:
-------------- celery@paulo-Inspiron-3420 v4.2.1 (windowlicker)
---- **** -----
--- * *** * -- Linux-4.15.0-36-generic-x86_64-with-Ubuntu-18.04-bionic 2018-10-30 13:44:07
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: mycelery:0x7ff88ca043c8
- ** ---------- .> transport: amqp://<user>:**@<remote_ip>:5672/<vhost>
- ** ---------- .> results: disabled://
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
I tried stopping the rabbitmq server and even uninstalling it, but celery keeps queuing tasks to localhost.
Can anyone help?
Answer 0 (score: 1)
You need to add something like this to the __init__.py file in the same directory as your celery.py file:
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ('celery_app',)
Also make sure you start the worker process from inside the project's virtualenv.
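If tasks still go to localhost after that, a quick way to see which broker URL the Django-side app object actually resolved is a debugging sketch like this (assuming the project layout from the question):

# Run inside the project's virtualenv: python manage.py shell
from project.celery import app

# Prints the broker that tasks will be published to; if this still shows a
# localhost/guest URL, the remote broker settings are not being picked up
# by the Django process.
print(app.conf.broker_url)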