I'm using Celery and Celery Beat to handle task execution and scheduled tasks in a Python project. I am not using Django.
Executing Celery tasks is working as expected. However, I've hit a wall trying to get scheduled tasks (celery beat) to run.
I've followed the Celery documentation and added my task to app.conf.beat_schedule. If I print out the beat schedule after adding the task, I can see that the task has been added to app.conf.beat_schedule successfully.
import importlib
import logging

from celery import Celery

logger = logging.getLogger(__name__)

# Celery init
app = Celery('tasks', broker='pyamqp://guest@localhost//')

# get the latest device reading from the appropriate provider
@app.task(bind=True, retry_backoff=True)
def get_reading(self, provider, path, device, config, location, callback):
    logger.info("get_reading() called")
    module = importlib.import_module('modules.%s' % provider)
    try:
        module.get_reading(path, device, config, location, callback)
    except Exception as e:
        self.retry(exc=e)

# add the periodic task
def add_get_reading_periodic_task(provider, path, device, config, location, callback, interval=600.0):
    app.conf.beat_schedule = {
        "poll-provider": {
            "task": "get_reading",
            "schedule": interval,
            "args": (provider, path, device, config, location, callback)
        }
    }
    logger.info(app.conf.beat_schedule)
    logger.info("Added task 'poll-provider' for %s to beat schedule" % provider)
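As context for the code above: a beat_schedule entry is just a plain dict keyed by entry name, and beat reads app.conf.beat_schedule when the beat process starts, so a schedule built from a function called later (in a different process) is not visible to it. A broker-free sketch of the entry shape, where the fully qualified task name tasks.get_reading is an assumption for illustration:

```python
# Sketch of the beat_schedule structure (plain Python, no broker needed).
# Note: beat snapshots app.conf.beat_schedule when the beat process
# starts, so the dict must exist at import time of the module passed
# to `celery beat -A`, not be set later from another process.
def build_beat_schedule(provider, path, device, config, location,
                        callback, interval=600.0):
    return {
        "poll-provider": {
            # the task name must match the name the worker registered;
            # "tasks.get_reading" assumes the task lives in a module
            # called `tasks` (an assumption for this sketch)
            "task": "tasks.get_reading",
            "schedule": interval,
            "args": (provider, path, device, config, location, callback),
        }
    }

schedule = build_beat_schedule("provider1", "/opt/provider1", None,
                               {}, "lan.local", {}, interval=10.0)
```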

Looking at my application logs, I can see that app.conf.beat_schedule has been updated with the data passed to add_get_reading_periodic_task():
2017-08-17 11:07:13,216 - gateway - INFO - {'poll-provider': {'task': 'get_reading', 'schedule': 10, 'args': ('provider1', '/opt/provider1', None, {'location': {'lan.local': {'uri': 'http://192.168.1.10'}}}, 'lan.local', {'url': 'http://localhost:8080', 'token': '*******'})}}
2017-08-17 11:07:13,216 - gateway - INFO - Added task 'poll-provider' for provider1 to beat schedule
I'm running the celery worker and celery beat manually against the same application file (in separate terminal windows):
$ celery worker -A gateway --loglevel=INFO
$ celery beat -A gateway --loglevel=DEBUG
If I call get_reading.delay(...) in my application, it is executed by the celery worker as expected.
However, the celery beat process never shows any sign of the scheduled task being registered:
celery beat v4.0.2 (latentcall) is starting.
__ - ... __ - _
LocalTime -> 2017-08-17 11:05:15
Configuration ->
. broker -> amqp://guest:**@localhost:5672//
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%DEBUG
. maxinterval -> 5.00 minutes (300s)
[2017-08-17 11:05:15,228: DEBUG/MainProcess] Setting default socket timeout to 30
[2017-08-17 11:05:15,228: INFO/MainProcess] beat: Starting...
[2017-08-17 11:05:15,248: DEBUG/MainProcess] Current schedule:
<ScheduleEntry: celery.backend_cleanup celery.backend_cleanup() <crontab: 0 4 * * * (m/h/d/dM/MY)>
[2017-08-17 11:05:15,248: DEBUG/MainProcess] beat: Ticking with max interval->5.00 minutes
[2017-08-17 11:05:15,250: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.
[2017-08-17 11:10:15,351: DEBUG/MainProcess] beat: Synchronizing schedule...
[2017-08-17 11:10:15,355: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.
[2017-08-17 11:15:15,400: DEBUG/MainProcess] beat: Synchronizing schedule...
[2017-08-17 11:15:15,402: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.
[2017-08-17 11:20:15,502: DEBUG/MainProcess] beat: Synchronizing schedule...
[2017-08-17 11:20:15,504: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.
This seems to be confirmed by running celery inspect scheduled:
-> celery@localhost.lan: OK
    - empty -
I've tried launching celery beat both before and after adding the scheduled task to app.conf.beat_schedule, and in both cases the scheduled task never appears in celery beat.
I read that celery beat didn't support dynamic reloading of its configuration until version 4, but I'm running celery beat 4.0.2.
What am I doing wrong here? Why isn't celery beat showing my scheduled task?
Answer 0 (score: 0)
Have you tried using the code described in the documentation?
@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls test('hello') every 10 seconds.
    sender.add_periodic_task(10.0, test.s('hello'), name='add every 10')
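To make the mechanism concrete without a running broker, here is a toy stand-in (StubApp and its schedule dict are invented for illustration; the real Celery app wires this up through the on_after_configure signal at app setup time):

```python
# Toy illustration of what add_periodic_task() conceptually does:
# register a named entry mapping to (schedule, signature). StubApp is
# a made-up stand-in for the real Celery app object.
class StubApp:
    def __init__(self):
        self.schedule = {}

    def add_periodic_task(self, schedule, sig, name=None):
        # the real method also accepts crontab schedules and options
        self.schedule[name or repr(sig)] = (schedule, sig)

def setup_periodic_tasks(sender, **kwargs):
    # mirrors the documented hook: run every 10 seconds
    sender.add_periodic_task(10.0, "test.s('hello')", name='add every 10')

app = StubApp()
setup_periodic_tasks(app)
```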
Answer 1 (score: 0)
Create a model for managing periodic tasks (note: this approach relies on Django models and the PeriodicTask, IntervalSchedule and CrontabSchedule models):
class TaskScheduler(models.Model):
    periodic_task = models.OneToOneField(
        PeriodicTask, on_delete=models.CASCADE, blank=True, null=True)
    node = models.OneToOneField(
        Node, on_delete=models.CASCADE, blank=True, null=True)

    @staticmethod
    def schedule_every(task_name, period, every, scheduler_name, args, queue=None):
        # Schedules a task by name every "every" "period". An example call:
        #   TaskScheduler.schedule_every('mycustomtask', 'seconds', 30, 'name', [1, 2, 3])
        # would schedule the custom task to run every 30 seconds with the
        # arguments 1, 2 and 3 passed to the actual task.
        permissible_periods = ['days', 'hours', 'minutes', 'seconds']
        if period not in permissible_periods:
            raise Exception('Invalid period specified')
        # create the periodic task and the interval
        ptask_name = scheduler_name
        interval_schedules = IntervalSchedule.objects.filter(
            period=period, every=every)
        if interval_schedules:  # reuse an existing interval schedule if one matches
            interval_schedule = interval_schedules[0]
        else:  # create a brand new interval schedule
            interval_schedule = IntervalSchedule()
            interval_schedule.every = every  # should check to make sure this is a positive int
            interval_schedule.period = period
            interval_schedule.save()
        if queue:
            ptask = PeriodicTask(name=ptask_name, task=task_name,
                                 interval=interval_schedule, queue=queue)
        else:
            ptask = PeriodicTask(name=ptask_name, task=task_name,
                                 interval=interval_schedule)
        if args:
            ptask.args = args
        ptask.save()
        return TaskScheduler.objects.create(periodic_task=ptask)

    @staticmethod
    def schedule_cron(task_name, at, scheduler_name, args):
        # Schedules a task by name every day at the given time.
        ptask_name = scheduler_name
        crons = CrontabSchedule.objects.filter(
            hour=at.hour, minute=at.minute)
        if crons:  # reuse an existing CrontabSchedule if one matches
            cron = crons[0]
        else:  # create a brand new CrontabSchedule
            cron = CrontabSchedule()
            cron.hour = at.hour
            cron.minute = at.minute
            cron.save()
        ptask = PeriodicTask(name=ptask_name,
                             crontab=cron, task=task_name)
        if args:
            ptask.args = args
        ptask.save()
        return TaskScheduler.objects.create(periodic_task=ptask)

    def stop(self):
        """Pauses the task."""
        ptask = self.periodic_task
        ptask.enabled = False
        ptask.save()

    def start(self):
        """Resumes the task."""
        ptask = self.periodic_task
        ptask.enabled = True
        ptask.save()

    def terminate(self):
        self.stop()
        ptask = self.periodic_task
        PeriodicTask.objects.get(name=ptask.name).delete()
        self.delete()
        ptask.delete()
Then, from your code:
# give the periodic task a unique name
scheduler_name = "%s_%s" % ('node_heartbeat', str(node.pk))
# organize the task arguments
args = json.dumps([node.extern_id, node.pk])
# create the periodic task in the heartbeat queue
task_scheduler = TaskScheduler.schedule_every(
    'core.tasks.pdb_heartbeat', 'seconds', 15, scheduler_name, args, 'heartbeat')
task_scheduler.node = node
task_scheduler.save()
I came across the original class here a long time ago; I've since added schedule_cron.