How to avoid log file bloat in Python?

Date: 2019-04-28 06:47:31

Tags: python python-3.x exception logging

My script runs every 5 seconds. Errors are logged to a file. This means that, as long as the error persists, the log file gets bloated with the same error every 5 seconds:

while True:
    try:
        i_might_fail()   # As long as this line fails...
    except Exception as ex:
        logger.error(ex) # ... the log file gets bloated
    time.sleep(5)

Terminating the script is not an option; it has to retry every 5 seconds.

I'm looking for a logging feature that ignores identical exceptions for x minutes, something like:

logger.ignore_duplicates_for(10, 'minutes')

Any ideas? Thanks in advance!

1 answer:

Answer 0 (score: 2):

This can be implemented as follows:

import logging
import time
import datetime

logger = logging.getLogger(__file__)


# Minimum interval that must pass before the same message is logged again.
TIMEDELTA = datetime.timedelta(seconds=5)


def error_without_duplicates(self, msg, *args, **kwargs):
    # Lazily attach a cache mapping message text -> time it was last logged.
    if not hasattr(self, 'msg_cache'):
        self.msg_cache = {}
    str_msg = str(msg)
    now = datetime.datetime.utcnow()
    if str_msg not in self.msg_cache:
        # First occurrence: log it and remember when.
        self.error(msg, *args, **kwargs)
        self.msg_cache[str_msg] = now
    elif now - self.msg_cache[str_msg] > TIMEDELTA:
        # Seen before, but longer ago than TIMEDELTA: log it again.
        self.error(msg, *args, **kwargs)
        self.msg_cache[str_msg] = now


# Monkey-patch the method onto Logger so every logger instance gets it.
logging.Logger.error_without_duplicates = error_without_duplicates


while True:
    try:
        a = 1 / 0
    except Exception as ex:
        logger.error_without_duplicates(ex)  # logged every 5 seconds, not every second
    time.sleep(1)
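
A more idiomatic variant of the same idea is to hang the deduplication on a logging.Filter, so plain logger.error(...) calls are rate-limited without needing a custom method. A minimal sketch, assuming the same 5-second window; the DuplicateFilter class and its interval parameter are illustrative names, not part of the standard library:

import datetime
import logging


class DuplicateFilter(logging.Filter):
    """Drop records whose message was already logged within `interval`."""

    def __init__(self, interval=datetime.timedelta(seconds=5)):
        super().__init__()
        self.interval = interval
        self.last_seen = {}  # message text -> time it was last allowed through

    def filter(self, record):
        msg = record.getMessage()
        now = datetime.datetime.utcnow()
        last = self.last_seen.get(msg)
        if last is not None and now - last <= self.interval:
            return False  # suppress the duplicate
        self.last_seen[msg] = now
        return True  # allow the record through


logger = logging.getLogger(__name__)
logger.addFilter(DuplicateFilter())

Because the filter is attached to the logger itself, every handler downstream sees only the deduplicated stream of records, and no call site has to change.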