I'm new to Celery and SQS, and would like to use them to periodically check messages stored in SQS and then fire off consumers. Both the consumer and Celery live on EC2, while the messages are sent from GAE using the boto library.

Currently I'm confused about:

1. In the message body of creating_msg_gae.py, what task information should I put? I assume it should be the name of my Celery task?
2. In the message body of creating_msg_gae.py, is url treated as the argument to be processed by my consumer (the function do_something_url(url) in tasks.py)?
3. I run Celery with celery worker -A celery_c -l info, and it seems to check SQS periodically on its own. Do I need to create a PeriodicTask in Celery?

I'd really appreciate any advice that helps me sort this out.
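For context on questions 1 and 2: when the producer can depend on Celery itself, the message is normally enqueued through Celery's API rather than hand-written. A minimal sketch, assuming the task ends up registered under Celery's default-derived name tasks.do_something_url and AWS credentials are in the environment as in celeryconfig.py below:

# Sketch: enqueue by task name so the producer never imports tasks.py.
# 'tasks.do_something_url' is an assumption: it is the name Celery derives
# by default from the module (tasks) and function (do_something_url).
from celery import Celery

app = Celery('uber', broker='sqs://')
app.conf.BROKER_TRANSPORT_OPTIONS = {'region': 'us-east-1'}

# The single positional argument mirrors the signature do_something_url(url).
app.send_task('tasks.do_something_url',
              args=[['my_s3_address']],
              queue='uber_batch')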
creating_msg_gae.py:

import base64
import json

from boto import sqs

conn = sqs.connect_to_region("us-east-1",
                             aws_access_key_id='aaa',
                             aws_secret_access_key='bbb')
my_queue = conn.get_queue('uber_batch')
msg = {'properties': {'content_type': 'application/json',
                      'content_encoding': 'utf-8',
                      'body_encoding': 'base64',
                      'delivery_tag': None,
                      'delivery_info': {'exchange': None, 'routing_key': None}}}
body = {'id': 'theid',
        ########### Question 1 #######
        'task': 'what task name I should put here?',
        'url': ['my_s3_address']}
msg.update({'body': base64.encodestring(json.dumps(body))})
my_queue.write(my_queue.new_message(json.dumps(msg)))
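If the message must stay hand-crafted, the body above has to follow Celery's message protocol (version 1 in the 3.x series), whose body carries the fields task, id, args and kwargs; the worker looks up task in its registry of names. A sketch of the body under that assumption, using the default-derived name tasks.do_something_url (the full envelope kombu expects around this body is not shown):

import json
import uuid

# Sketch only: body fields in Celery's version-1 message format.
# 'task' must match the name the worker registers, which for the
# decorator in tasks.py defaults to 'tasks.do_something_url'.
body = {'id': str(uuid.uuid4()),      # unique task id instead of 'theid'
        'task': 'tasks.do_something_url',
        'args': [['my_s3_address']],  # positional args for do_something_url(url)
        'kwargs': {}}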
My Celery file layout is as follows:
./ce_folder/
celery_c.py, celeryconfig.py, tasks.py, __init__.py
celeryconfig.py:

import os

BROKER_BACKEND = "SQS"
AWS_ACCESS_KEY_ID = 'aaa'
AWS_SECRET_ACCESS_KEY = 'bbb'
os.environ.setdefault("AWS_ACCESS_KEY_ID", AWS_ACCESS_KEY_ID)
os.environ.setdefault("AWS_SECRET_ACCESS_KEY", AWS_SECRET_ACCESS_KEY)

BROKER_URL = 'sqs://'
# Merge all transport options into one dict: repeated assignments
# overwrite each other, leaving only the last one in effect.
BROKER_TRANSPORT_OPTIONS = {'region': 'us-east-1',
                            'visibility_timeout': 60,
                            'polling_interval': 30}

CELERY_DEFAULT_QUEUE = 'uber_batch'
CELERY_DEFAULT_EXCHANGE = CELERY_DEFAULT_QUEUE
# The exchange type is a kind ('direct', 'topic', ...), not a queue name.
CELERY_DEFAULT_EXCHANGE_TYPE = 'direct'
CELERY_DEFAULT_ROUTING_KEY = CELERY_DEFAULT_QUEUE
CELERY_QUEUES = {
    CELERY_DEFAULT_QUEUE: {
        'exchange': CELERY_DEFAULT_QUEUE,
        'binding_key': CELERY_DEFAULT_QUEUE,
    }
}
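On question 3: the worker itself polls SQS every polling_interval seconds, so no PeriodicTask is needed just to consume messages. If a genuinely scheduled job is wanted on top of that, the 3.x-style beat schedule lives in this same config file; a sketch with a hypothetical entry name and interval:

from datetime import timedelta

# Sketch: runs the task every 5 minutes via celery beat
# (start the worker with -B, or run `celery beat` separately).
CELERYBEAT_SCHEDULE = {
    'process-s3-batch': {                  # hypothetical entry name
        'task': 'tasks.do_something_url',  # default-derived task name
        'schedule': timedelta(minutes=5),  # hypothetical interval
        'args': (['my_s3_address'],),
    },
}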
celery_c.py:

from __future__ import absolute_import

from celery import Celery

app = Celery('uber')
app.config_from_object('celeryconfig')

if __name__ == '__main__':
    app.start()
tasks.py:

from __future__ import absolute_import

from celery_c import app

@app.task
def do_something_url(url):
    # ..download the file from url
    # ..do some calculations
    # ..upload the result files to S3
    result_url = ...  # placeholder: the S3 URL of the uploaded results
    return result_url
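With the worker from celery worker -A celery_c -l info running, the task can also be exercised end-to-end from Python without touching boto directly; a sketch:

from tasks import do_something_url

# .delay() serializes the call through celeryconfig's SQS broker and
# writes it to the uber_batch queue; the running worker picks it up.
async_result = do_something_url.delay(['my_s3_address'])
print(async_result.id)  # task id; reading the return value needs a result backend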