Automatically deleting old Python log files

Asked: 2017-05-12 22:18:18

Tags: python logging

I have a Python program that runs daily. I use the logging module with a FileHandler to write logs to a file. I want each run's logs to go into its own file, with a timestamp. However, I want to delete old files (say, older than 3 months) so they don't fill up the disk.

I have looked at RotatingFileHandler and TimedRotatingFileHandler, but I don't want a single run's logs split across multiple files, even if a single run takes several days. Is there a built-in way to do this?

4 answers:

Answer 0 (score: 2)

import logging
import time
from logging.handlers import RotatingFileHandler

logFile = 'test-' + time.strftime("%Y%m%d-%H%M%S")+ '.log'

logger = logging.getLogger('my_logger')
handler = RotatingFileHandler(logFile, mode='a', maxBytes=50*1024*1024, 
                                 backupCount=5, encoding=None, delay=False)
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)

for _ in range(10000):
    logger.debug("Hello, world!")

Answer 1 (score: 1)

As @MartijnPieters suggested in this question, you can easily extend the FileHandler class to handle the deletion logic yourself. For example, the class below keeps only the last backup_count files.

import os
import re
import datetime
import logging 
from itertools import islice


class TimedPatternFileHandler(logging.FileHandler):
    """File handler that uses the current time fo the log filename,
    by formating the current datetime, according to filename_pattern, using
    the strftime function.

    If backup_count is non-zero, then older filenames that match the base
    filename are deleted to only leave the backup_count most recent copies,
    whenever opening a new log file with a different name.

    """

    def __init__(self, filename_pattern, mode, backup_count):
        self.filename_pattern = os.path.abspath(filename_pattern)
        self.backup_count = backup_count
        self.filename = datetime.datetime.now().strftime(self.filename_pattern)

        # Delete older matching files, keeping only the most recent backup_count.
        for entry in islice(self._matching_files(), self.backup_count, None):
            os.remove(entry.path)
        super().__init__(filename=self.filename, mode=mode)

    @property
    def filename(self):
        """Generate the 'current' filename to open"""
        # use the start of *this* interval, not the next
        return datetime.datetime.now().strftime(self.filename_pattern)

    @filename.setter
    def filename(self, _):
        pass

    def _matching_files(self):
        """Generate DirEntry entries that match the filename pattern.

        The files are ordered by their last modification time, most recent
        files first.

        """
        matches = []
        basename = os.path.basename(self.filename_pattern)
        pattern = re.compile(re.sub('%[a-zA-Z]', '.*', basename))

        for entry in os.scandir(os.path.dirname(self.filename_pattern)):
            if not entry.is_file():
                continue
            entry_basename = os.path.basename(entry.path)
            if re.match(pattern, entry_basename):
                matches.append(entry)
        matches.sort(key=lambda e: e.stat().st_mtime, reverse=True)
        return iter(matches)


def create_timed_rotating_log(path):
    """"""
    logger = logging.getLogger("Rotating Log")
    logger.setLevel(logging.INFO)

    handler = TimedPatternFileHandler('{}_%H-%M-%S.log'.format(path), mode='a', backup_count=5)

    logger.addHandler(handler)
    logger.info("This is a test!")

Answer 2 (score: 0)

Get the date/time (see this answer for how to get a timestamp). If a file is more than 3 months older than the current date, delete it with:

import os
os.remove("filename.extension")

Package this script as an exe with py2exe, then use any task scheduler to run the job at startup.

Windows: open the Run dialog, type shell:startup, and place your exe there.

On OSX: the old way was to create a cron job, which in my experience no longer works in many cases, though it can still be tried. Apple's newly recommended way is Creating Launchd Jobs. You can also refer to this topic for more detailed instructions.

Answer 3 (score: 0)

The logging module has a built-in TimedRotatingFileHandler:

import logging
from logging import Formatter
from logging.handlers import TimedRotatingFileHandler

# get named logger
logger = logging.getLogger(__name__)

# create handler
handler = TimedRotatingFileHandler(filename='runtime.log', when='D', interval=1, backupCount=90, encoding='utf-8', delay=False)

# create formatter and add to handler
formatter = Formatter(fmt='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)

# add the handler to named logger
logger.addHandler(handler)

# set the logging level
logger.setLevel(logging.INFO)

# --------------------------------------

# log something
logger.info("test")

Old log files automatically get a timestamp appended to their names.

A new backup is created every day.

If more than 91 files exist (the current one plus backups), the oldest ones are deleted.
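
For reference, the rotated backups are named by appending the rollover date to the base filename (for daily rotation, something like runtime.log.2017-05-11), so a quick sketch to see what is kept on disk is:

import glob

# the current runtime.log plus its dated backups, in name order
for path in sorted(glob.glob('runtime.log*')):
    print(path)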