Logging with multiprocessing

Date: 2012-05-19 12:16:54

Tags: python logging multiprocessing

I have the following logger module (saved as logger.py):

import logging, logging.handlers
import config

log = logging.getLogger('myLog')

def start():
    "Function sets up the logging environment."
    log.setLevel(logging.DEBUG)
    formatter = logging.Formatter(fmt='%(asctime)s [%(levelname)s] %(message)s', datefmt='%d-%m-%y %H:%M:%S')

    if config.logfile_enable:
        filehandler = logging.handlers.RotatingFileHandler(config.logfile_name, maxBytes=config.logfile_maxsize, backupCount=config.logfile_backupCount)
        filehandler.setLevel(logging.DEBUG)
        filehandler.setFormatter(formatter)
        log.addHandler(filehandler)

    console = logging.StreamHandler()
    console.setLevel(logging.DEBUG)
    console.setFormatter(logging.Formatter('[%(levelname)s] %(message)s')) # nicer format for console
    log.addHandler(console)

    # Levels are: debug, info, warning, error, critical.
    log.debug("Started logging to %s [maxBytes: %d, backupCount: %d]" % (config.logfile_name, config.logfile_maxsize, config.logfile_backupCount))

def stop():
    "Function closes and cleans up the logging environment."
    logging.shutdown()

To set up logging I call logger.start() once, and in every other project file I do from logger import log and then simply call log.debug() or log.error() wherever needed. This works fine everywhere in the script (across different classes, functions and files), but it does not work in separate processes started via the multiprocessing module.
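Roughly, the usage looks like this (main.py and the worker function are illustrative names, not my actual files, and they assume the logger.py module shown above):

import multiprocessing
import logger              # the module shown above
from logger import log

def worker():
    # the 'myLog' logger exists in the child, but it can end up with no
    # handlers (e.g. when the child is spawned rather than forked),
    # which produces the error below
    log.debug("hello from the child process")

if __name__ == '__main__':
    logger.start()         # configure handlers once, in the parent
    log.debug("hello from the parent process")
    p = multiprocessing.Process(target=worker)
    p.start()
    p.join()
    logger.stop()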

I get the following error: No handlers could be found for logger "myLog"

What should I do?

1 answer:

Answer (score: 9):

From the Python docs: logging to a single file from multiple processes is not supported, because there is no standard way to serialize access to a single file across multiple processes in Python.

See: http://docs.python.org/howto/logging-cookbook.html#logging-to-a-single-file-from-multiple-processes
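If you are on Python 3.2 or later, the pattern the cookbook recommends is to push records from every process into a multiprocessing queue and let a single listener in the parent write the file. A rough sketch of that pattern (the file name app.log and the worker function are placeholders, not taken from your code):

import logging, logging.handlers
import multiprocessing

def worker(queue):
    # each child logs only through the queue; it never touches the file
    log = logging.getLogger('myLog')
    log.setLevel(logging.DEBUG)
    log.addHandler(logging.handlers.QueueHandler(queue))
    log.debug("message from pid %s", multiprocessing.current_process().pid)

if __name__ == '__main__':
    queue = multiprocessing.Queue()
    filehandler = logging.FileHandler('app.log')   # only the listener writes here
    filehandler.setFormatter(logging.Formatter('%(asctime)s [%(levelname)s] %(message)s'))
    listener = logging.handlers.QueueListener(queue, filehandler)
    listener.start()
    procs = [multiprocessing.Process(target=worker, args=(queue,)) for _ in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    listener.stop()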

BTW: what I do in this situation is use Scribe, a distributed log aggregator, which I log to over TCP. This lets me send logs from all my servers, not just all my processes, to the same place.

Check out this project: http://pypi.python.org/pypi/ScribeHandler
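I won't reproduce ScribeHandler's exact API from memory, but the general "log over TCP to one central collector" idea can be sketched with the standard library's SocketHandler (logs.example.com is a placeholder host):

import logging, logging.handlers

log = logging.getLogger('myLog')
log.setLevel(logging.DEBUG)

# each process opens its own TCP connection and sends pickled LogRecords,
# so there is no shared file to serialize access to
log.addHandler(logging.handlers.SocketHandler('logs.example.com',
                                               logging.handlers.DEFAULT_TCP_LOGGING_PORT))

log.info("this record is forwarded to the central collector over TCP")

The receiving end needs a small socket server to unpickle and handle the records; the logging cookbook linked above has a complete example of one.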