I am new to celery and I'm using Celery to run asynchronous tasks.
The examples in the Celery project haven't helped me much. Can anyone point me to some useful examples?
Answer 0 (score: 6)
To use MongoDB as the backend store, you must explicitly configure Celery to use MongoDB as the result backend.
http://docs.celeryproject.org/en/latest/getting-started/brokers/mongodb.html#broker-mongodb
As you said, the docs do not show a complete working example. I just started playing with Celery but have been using MongoDB. I created a short working tutorial using MongoDB and Celery: http://skillachie.com/?p=953 However, the snippets below should contain everything you need for a hello world with Celery and MongoDB.
celeryconfig.py
from celery.schedules import crontab

CELERY_RESULT_BACKEND = "mongodb"
CELERY_MONGODB_BACKEND_SETTINGS = {
    "host": "127.0.0.1",
    "port": 27017,
    "database": "jobs",
    "taskmeta_collection": "stock_taskmeta_collection",
}

# Used to schedule tasks periodically and pass optional arguments.
# Can be very useful. Celery does not seem to support one-off scheduled
# tasks, only periodic ones.
CELERYBEAT_SCHEDULE = {
    'every-minute': {
        'task': 'tasks.add',
        'schedule': crontab(minute='*/1'),
        'args': (1, 2),
    },
}
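For reference, the crontab entry above fires tasks.add(1, 2) once every minute. If you don't need cron semantics, a plain timedelta interval should also work (a minimal sketch using the same setting names; this variant is not part of the original config):

from datetime import timedelta

# Alternative to the crontab entry above: run tasks.add(1, 2) on a fixed
# 60-second interval instead of a cron-style schedule.
CELERYBEAT_SCHEDULE = {
    'every-minute': {
        'task': 'tasks.add',
        'schedule': timedelta(seconds=60),
        'args': (1, 2),
    },
}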
tasks.py
from celery import Celery
import time

# Specify the mongodb host and database to connect to
BROKER_URL = 'mongodb://localhost:27017/jobs'

celery = Celery('EOD_TASKS', broker=BROKER_URL)

# Loads settings for the backend to store results of jobs
celery.config_from_object('celeryconfig')

@celery.task
def add(x, y):
    time.sleep(30)
    return x + y
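Not part of the original answer, but once a worker is running (e.g. celery -A tasks worker --loglevel=info, plus celery -A tasks beat if you want the periodic schedule), a quick sanity check could look like this; the file name call_add.py is made up:

# call_add.py -- hypothetical helper script, not part of the original answer
from tasks import add

if __name__ == '__main__':
    # .delay() pushes the task onto the MongoDB broker and returns an AsyncResult
    result = add.delay(4, 4)
    # .get() blocks until the worker writes the return value into the
    # stock_taskmeta_collection configured in celeryconfig.py
    print(result.get(timeout=60))  # -> 8 (after the 30-second sleep in add)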
Answer 1 (score: 0)
I have been testing RabbitMQ as the broker with MongoDB as the backend, and MongoDB as both broker and backend. These are my findings; I hope they help someone.
Assumption: you have MongoDB running on the default settings (localhost:27017). Set up the environment with conda (you can use any package manager):
conda update -n base conda -c anaconda
conda create -n apps python=3.6 pymongo
conda install -n apps -c conda-forge celery
conda activate apps
This updates conda, creates an environment named apps, and installs pymongo and celery in it.
RabbitMQ as the broker, MongoDB as the backend
sudo apt install rabbitmq-server
sudo service rabbitmq-server restart
sudo rabbitmqctl status
If there are no errors, RabbitMQ is up and running. Let's create the tasks in executor.py and call them from runner.py.
# executor.py
import time
from celery import Celery

BROKER_URL = 'amqp://localhost//'
BACKEND_URL = 'mongodb://localhost:27017/from_celery'

app = Celery('executor', broker=BROKER_URL, backend=BACKEND_URL)

@app.task
def pizza_bot(string: str, snooze=10):
    '''return a dictionary with bot and
    lower case string input
    '''
    print(f'Pretending to be working {snooze} seconds')
    time.sleep(snooze)
    return {'bot': string.lower()}
We call them in runner.py:
# runner.py
import time
from datetime import datetime

from executor import pizza_bot

def run_pizza(msg: str, use_celery: bool = True):
    start_time = datetime.now()

    if use_celery:  # Using celery
        response = pizza_bot.delay(msg)
    else:  # Not using celery
        response = pizza_bot(msg)

    print(f'It took {datetime.now() - start_time}!'
          ' to run')
    print(f'response: {response}')
    return response

if __name__ == '__main__':
    # Call using celery
    response = run_pizza('This finishes extra fast')
    while not response.ready():
        print(f'[Waiting] It is {response.ready()} that we have results')
        time.sleep(2)  # sleep two seconds

    print('\n We got results:')
    print(response.result)
Run celery on terminal A:
cd path_to_our_python_files
celery -A executor.app worker --loglevel=info
This is only for development; I wanted to see what was happening in the background. In production, run it as a daemon.
Run runner.py on terminal B:
cd path_to_our_python_files
conda activate apps
python runner.py
On terminal A you will see that the task was received, and it completes within the snooze seconds. In your MongoDB you will see a new collection called from_celery with the message and the results.
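Not in the original answer, but if you want to peek at what Celery wrote into the result backend, something like this with pymongo should do it (a sketch; it assumes the default celery_taskmeta collection name, since executor.py does not override it):

# inspect_results.py -- hypothetical helper for looking at stored task results
from pymongo import MongoClient

client = MongoClient('mongodb://localhost:27017/')
db = client['from_celery']

# With no taskmeta_collection configured, the MongoDB backend should use
# 'celery_taskmeta' as the collection name.
for doc in db['celery_taskmeta'].find():
    print(doc['_id'], doc.get('status'), doc.get('result'))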
MongoDB as both broker and backend
A simple modification is needed to set this up. As mentioned, I had to create a configuration file with the MongoDB backend settings.
# mongo_config.py

# Backend settings
CELERY_RESULT_BACKEND = "mongodb"
CELERY_MONGODB_BACKEND_SETTINGS = {
    "host": "localhost",
    "port": 27017,
    "database": "celery",
    "taskmeta_collection": "pizza_collection",
}
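A side note that is not in the original answer: newer Celery releases (4+) prefer lowercase setting names, so an equivalent config might look roughly like this (my assumption of the lowercase equivalents; double-check against the version you run):

# mongo_config_new.py -- hypothetical lowercase-style equivalent for Celery 4+
result_backend = "mongodb"
mongodb_backend_settings = {
    "host": "localhost",
    "port": 27017,
    "database": "celery",
    "taskmeta_collection": "pizza_collection",
}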
Let's create executor_updated.py, which is almost identical to executor.py, but the broker is now MongoDB and the backend is added via config_from_object:
# executor_updated.py
import time
from celery import Celery

BROKER_URL = 'mongodb://localhost:27017/celery'

app = Celery('executor_updated', broker=BROKER_URL)

# Load backend settings
app.config_from_object('mongo_config')

@app.task
def pizza_bot(string: str, snooze=10):
    '''return a dictionary with bot and
    lower case string input
    '''
    print(f'Pretending to be working {snooze} seconds')
    time.sleep(snooze)
    return {'bot': string.lower()}
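If you just want to confirm that the MongoDB broker/backend pair round-trips correctly, a quick check like the following could work once the worker below is running (a sketch; quick_check.py is made up):

# quick_check.py -- hypothetical, not part of the original answer
from executor_updated import pizza_bot

if __name__ == '__main__':
    # With MongoDB as both broker and backend, the call looks the same
    # from the caller's point of view as in the RabbitMQ setup above.
    result = pizza_bot.delay('CHECKING THE MONGODB BROKER', snooze=5)
    print(result.get(timeout=30))  # -> {'bot': 'checking the mongodb broker'}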
Run celery on terminal C:
cd path_to_our_python_files
celery -A executor_updated.app worker --loglevel=info
Run runner.py on terminal D (pointing its import at executor_updated instead of executor):
cd path_to_our_python_files
conda activate apps
python runner.py
Now we have MongoDB as both broker and backend. In MongoDB you will see a database called celery with a collection pizza_collection holding the task results.
I hope this helps you get started with these awesome tools.
Update: I have added a GitHub repo with a toy web-scraping example that uses celery to schedule runs and save the data to MongoDB: Advance_Scraping