Adding a custom logger to Celery logging in Django

Date: 2015-09-18 09:08:19

Tags: python django logging celery

I have added a custom logging handler to my Django application that writes log entries to the database.

import logging

from my_app.models import MyModel  # adjust to wherever the log-entry model lives


class DbLogHandler(logging.Handler):  # Inherit from logging.Handler
    def __init__(self):
        # run the regular Handler __init__
        logging.Handler.__init__(self)
        self.entries = []
        logging.debug("*****************[DB] INIT db handler")

    def emit(self, record):
        logging.debug("*****************[DB] called emit on db handler")
        try:
            # skip records that carry no revision context
            revision_instance = getattr(record, 'revision', None)
            if revision_instance is None:
                return
            # instantiate the model; nothing is saved until flush()
            log_entry = MyModel(name=record.name,
                                log_level_name=record.levelname,
                                message=record.msg,
                                module=record.module,
                                func_name=record.funcName,
                                line_no=record.lineno,
                                exception=record.exc_text,
                                revision=revision_instance)
            self.entries.append(log_entry)
        except Exception as ex:
            print(ex)

    def flush(self):
        if self.entries:
            MyModel.objects.bulk_create(self.entries)
            logging.info("[+] Successfully flushed {0:d} log entries to "
                         "the DB".format(len(self.entries)))
        else:
            logging.info("[*] No log entries for DB logger")
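The handler's buffer-then-flush pattern can be exercised on its own, without Django. A minimal sketch (the `persisted` list stands in for `MyModel.objects.bulk_create`):

```python
import logging

class BufferingHandler(logging.Handler):
    """Stand-in for DbLogHandler: buffer records, 'persist' them on flush()."""
    def __init__(self):
        logging.Handler.__init__(self)
        self.entries = []
        self.persisted = []  # stands in for the database table

    def emit(self, record):
        # buffer a lightweight tuple instead of a model instance
        self.entries.append((record.name, record.levelname, record.getMessage()))

    def flush(self):
        if self.entries:
            self.persisted.extend(self.entries)  # stands in for bulk_create
            self.entries = []

handler = BufferingHandler()
logger = logging.getLogger("demo")
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)

logger.info("hello")
logger.warning("world")
handler.flush()
print(len(handler.persisted))  # 2
```

Nothing touches the database until `flush()`, which is exactly why a worker process that dies before flushing (as happens later in this question) leaves no entries behind.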

When I invoke the functions directly, say by running a management command, the handler is used correctly. In production, however, the entry point will be a Celery task. My understanding is that Celery has its own logging mechanism. What I am trying, but cannot get to work, is to add my DB handler to the Celery logging as well; that is, all Celery logs should also be sent to the DbLogHandler.

This is how I am trying to accomplish it, in my_app.celery_logging.logger:

import logging

import celery
from celery.utils.log import get_task_logger
from django.conf import settings


class CeleryAdapter(logging.LoggerAdapter):
    """Adapter to add current task context to "extra" log fields."""
    def process(self, msg, kwargs):
        if not celery.current_task:
            return msg, kwargs

        kwargs = kwargs.copy()
        kwargs.setdefault('extra', {})['celery'] = \
            vars(celery.current_task.request)
        return msg, kwargs


def task_logger(name):
    """
    Return a custom celery task logger that will also log to db.

    We need to add the db handler explicitly, otherwise it is not picked
    up by celery.

    Also, we wrap the logger in a CeleryAdapter to provide some extra
    celery-related context to the logging messages.
    """
    # first get the default celery task logger
    log = get_task_logger(name)

    # if available, add the db-log handler explicitly to the celery task
    # logger
    handlers = settings.LOGGING.get('handlers', {})
    db_handler_dict = handlers.get('db', None)
    if (db_handler_dict is not None and
            db_handler_dict != settings.NULL_HANDLER_PARAMS):
        # addHandler() expects a Handler instance, not a config dict
        from my_app.db_logging.db_logger import DbLogHandler
        db_handler = DbLogHandler()
        db_handler.setLevel(logging.DEBUG)
        log.addHandler(db_handler)

    # wrap the logger by the CeleryAdapter to add some celery specific
    # context to the logs
    return CeleryAdapter(log, {})
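The `LoggerAdapter.process` hook used above injects per-call context into each record via the `extra` mechanism. A standalone sketch of that behavior, with a hypothetical `task_ctx` key in place of the real Celery task request:

```python
import logging

class ContextAdapter(logging.LoggerAdapter):
    """Like CeleryAdapter: inject context from self.extra into every record."""
    def process(self, msg, kwargs):
        kwargs = kwargs.copy()
        # attributes in 'extra' are copied onto the LogRecord by logging
        kwargs.setdefault("extra", {})["task_ctx"] = self.extra["task_ctx"]
        return msg, kwargs

records = []

class CaptureHandler(logging.Handler):
    def emit(self, record):
        records.append(record)

base = logging.getLogger("adapter_demo")
base.setLevel(logging.INFO)
base.addHandler(CaptureHandler())

# "task_ctx" is a placeholder for vars(celery.current_task.request)
log = ContextAdapter(base, {"task_ctx": {"id": "abc123"}})
log.info("task started")
print(records[0].task_ctx["id"])  # abc123
```

Any handler attached to the wrapped logger, such as the DB handler above, then sees the injected attribute on the record.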

Then, finally, in my task.py:

from my_app.celery_logging.logger import task_logger
logger = task_logger(__name__)

But from this point on, it is a world of pain. I cannot even describe exactly what is happening. When I start the worker and look at the Celery log output, I see that my db-logger is in fact being called, but Celery seems to lose the worker.

[2015-09-18 10:30:57,158: INFO/MainProcess] [*] No log entries for DB logger
Raven is not configured (logging is disabled). Please see the documentation for more information.
2015-09-18 10:30:58,659 raven.contrib.django.client.DjangoClient INFO Raven is not configured (logging is disabled). Please see the documentation for more information.
[2015-09-18 10:30:59,155: DEBUG/MainProcess] | Worker: Preparing bootsteps.
[2015-09-18 10:30:59,157: DEBUG/MainProcess] | Worker: Building graph...
[2015-09-18 10:30:59,158: DEBUG/MainProcess] | Worker: New boot order: {Timer, Hub, Queues (intra), Pool, Autoscaler, Autoreloader, StateDB, Beat, Consumer}
[2015-09-18 10:30:59,161: DEBUG/MainProcess] | Consumer: Preparing bootsteps.
[2015-09-18 10:30:59,161: DEBUG/MainProcess] | Consumer: Building graph...
[2015-09-18 10:30:59,164: DEBUG/MainProcess] | Consumer: New boot order: {Connection, Events, Mingle, Tasks, Control, Gossip, Agent, Heart, event loop}
[2015-09-18 10:30:59,167: DEBUG/MainProcess] | Worker: Starting Hub
[2015-09-18 10:30:59,167: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:30:59,167: DEBUG/MainProcess] | Worker: Starting Pool
[2015-09-18 10:30:59,173: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:30:59,173: DEBUG/MainProcess] | Worker: Starting Consumer
[2015-09-18 10:30:59,174: DEBUG/MainProcess] | Consumer: Starting Connection
[2015-09-18 10:30:59,180: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2015-09-18 10:30:59,180: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:30:59,180: DEBUG/MainProcess] | Consumer: Starting Events
[2015-09-18 10:30:59,188: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:30:59,188: DEBUG/MainProcess] | Consumer: Starting Mingle
[2015-09-18 10:30:59,188: INFO/MainProcess] mingle: searching for neighbors
[2015-09-18 10:31:00,196: INFO/MainProcess] mingle: all alone
[2015-09-18 10:31:00,196: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,197: DEBUG/MainProcess] | Consumer: Starting Tasks
[2015-09-18 10:31:00,203: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,204: DEBUG/MainProcess] | Consumer: Starting Control
[2015-09-18 10:31:00,207: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,208: DEBUG/MainProcess] | Consumer: Starting Gossip
[2015-09-18 10:31:00,211: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,211: DEBUG/MainProcess] | Consumer: Starting Heart
[2015-09-18 10:31:00,212: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,212: DEBUG/MainProcess] | Consumer: Starting event loop
[2015-09-18 10:31:00,213: WARNING/MainProcess] celery@vagrant-base-precise-amd64 ready.
[2015-09-18 10:31:00,213: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2015-09-18 10:31:00,255: ERROR/MainProcess] Unrecoverable error: WorkerLostError('Could not start worker processes',)
Traceback (most recent call last):
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/__init__.py", line 206, in start
    self.blueprint.start(self)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/bootsteps.py", line 123, in start
    step.start(parent)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/bootsteps.py", line 374, in start
    return self.obj.start()
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/consumer.py", line 278, in start
    blueprint.start(self)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/bootsteps.py", line 123, in start
    step.start(parent)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/consumer.py", line 821, in start
    c.loop(*c.loop_args())
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/loops.py", line 48, in asynloop
    raise WorkerLostError('Could not start worker processes')

When calling Celery tasks, I also no longer see any logs.

1 answer:

Answer 0 (score: 0)

Set worker_hijack_root_logger to False in your configuration, and customize your logger.
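Note that `worker_hijack_root_logger` is the Celery 4+ setting name; on Celery 3.1 (the version in the traceback) the equivalent uppercase setting is `CELERYD_HIJACK_ROOT_LOGGER`. A minimal sketch, assuming Django's settings module is the Celery config source:

```python
# settings.py / celeryconfig.py
# Celery 3.1 (as in the traceback) uses the uppercase name:
CELERYD_HIJACK_ROOT_LOGGER = False
# Celery 4+ lowercase equivalent:
# worker_hijack_root_logger = False

# Alternatively, connecting the setup_logging signal also stops Celery
# from reconfiguring logging, so Django's LOGGING dict (including the
# 'db' handler) applies inside workers as well:
from celery.signals import setup_logging

@setup_logging.connect
def configure_logging(**kwargs):
    import logging.config
    from django.conf import settings
    logging.config.dictConfig(settings.LOGGING)
```

This is a sketch of the two documented ways to keep Celery from hijacking the root logger; it has not been tested against the asker's exact setup.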

link