Rotating logs into a directory with Python

Date: 2011-03-04 07:41:01

Tags: python logging rotation

I have a file called Poller.log that log details are constantly appended to. I want this log file to be rotated daily and limited to 30 days of history. The code below works well for that.

Now I would like the rotated logs to land in a folder (i.e. logs/poller.log.2011-03-04_15-36). Is there any way to point the handler at the location where these rotated files should be created?

This Python script will be executed by cron.

import logging
import logging.handlers

LOG_FILENAME = '/home/stackoverflow/snmpdata/poller.log'

# Set up a specific logger with our desired output level
poll_logger = logging.getLogger('pollerLog')

# Add the log message handler to the logger
log_rotator = logging.handlers.TimedRotatingFileHandler(LOG_FILENAME, when='d', interval=1, backupCount=30, encoding=None, delay=False, utc=False)
poll_logger.addHandler(log_rotator)

# Roll over on application start
poll_logger.handlers[0].doRollover()
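For context, TimedRotatingFileHandler names each backup by appending a date suffix to the base filename, so rotated files land in the same directory as the live log — which is exactly the behavior the question wants to change. A minimal sketch of that default behavior, using a throwaway temporary directory rather than the poller path above:

```python
import logging
import logging.handlers
import os
import tempfile

# hypothetical temporary location standing in for /home/stackoverflow/snmpdata
log_dir = tempfile.mkdtemp()
base = os.path.join(log_dir, "poller.log")

handler = logging.handlers.TimedRotatingFileHandler(base, when="d", backupCount=30)
demo_logger = logging.getLogger("rotationDemo")
demo_logger.addHandler(handler)
demo_logger.setLevel(logging.INFO)
demo_logger.info("hello")

# Force a rollover: the backup appears next to poller.log,
# named poller.log.<date suffix>, not in a subfolder.
handler.doRollover()
rotated = [f for f in os.listdir(log_dir) if f.startswith("poller.log.")]
```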

3 Answers:

Answer 0 (score: 4)

The Python logging handlers don't allow this easily. You have two ways to achieve it:

  1. The simplest way: set LOG_FILENAME so the log already lives at logs/poller.log, and if you want to reach poller.log from anywhere else, use a symlink :)

  2. Create your own handler derived from TimedRotatingFileHandler, copy/paste doRollover() from the TimedRotatingFileHandler class in /usr/lib/python2.X/logging/handlers.py, and change:

     dfn = self.baseFilename + "." + time.strftime(self.suffix, timeTuple)

     to:

     dfn = os.path.join('logs', os.path.basename(self.baseFilename)) + "." + time.strftime(self.suffix, timeTuple)
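On Python 3.3+ there is a lighter alternative to copy/pasting doRollover(): BaseRotatingHandler exposes a `namer` callback that rewrites the rotated filename, so backups can be redirected into a subfolder without subclassing. A sketch under the assumption that a logs/ folder sits next to the live file (caveat: backupCount pruning scans the live file's directory, so backups moved elsewhere may not be cleaned up automatically):

```python
import logging
import logging.handlers
import os
import tempfile

# stand-in paths for the poller setup in the question
base_dir = tempfile.mkdtemp()
rotated_dir = os.path.join(base_dir, "logs")
os.makedirs(rotated_dir, exist_ok=True)

handler = logging.handlers.TimedRotatingFileHandler(
    os.path.join(base_dir, "poller.log"), when="d", backupCount=30)

# namer receives the default rotated name and returns the name actually used
handler.namer = lambda name: os.path.join(
    os.path.dirname(name), "logs", os.path.basename(name))

log = logging.getLogger("namerDemo")
log.addHandler(handler)
log.setLevel(logging.INFO)
log.info("hello")
handler.doRollover()

# the backup now sits in logs/, not next to the live file
in_subdir = [f for f in os.listdir(rotated_dir) if f.startswith("poller.log.")]
```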

Answer 1 (score: 2)

If you don't mind the extra dependency, you can always use Twisted's rotating log module. Twisted has a log file module that allows daily logs, weekly logs, or even, as in this case, monthly logs.

Answer 2 (score: 0)

I added this code in a separate process to move all the log backups into a folder.

import logging
import logging.handlers
import shutil, os, glob
import zipfile
import schedule
import time
import threading

zip_file_name = "Log.zip"
zip_file_path = "Logs/LogsArchive/Log.zip"

source_directory = "Logs"
archive_directory = "Logs/LogsArchive"

# module-level loggers, assumed to be configured elsewhere in LoggingModule
debug_logger = logging.getLogger("debug")
error_logger = logging.getLogger("error")


def moveAllFilesinDir(srcDir, dstDir, allLogs=False):
    try:
        # Check that both paths are directories
        if os.path.isdir(srcDir) and os.path.isdir(dstDir):
            # Iterate over the files in the source directory
            if not allLogs:
                # only rotated logs ("name.log.date") match the three-part pattern
                for filePath in glob.glob(srcDir + '/*.*.*'):
                    # Move each rotated file to the destination directory
                    shutil.move(filePath, dstDir)
            else:
                for filePath in glob.glob(srcDir + '/*.*'):
                    # Copy every file to the destination directory
                    shutil.copy(filePath, dstDir)
        else:
            debug_logger.debug("LoggingModule: - moveAllFilesinDir - srcDir & dstDir should be directories")
    except Exception:
        error_logger.error("Error in LoggingModule - moveAllFilesinDir", exc_info=True)


Only log files with a three-part extension, i.e. "name.log.date", will be moved. I'm working on a process to zip the archive folder.
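The '*.*.*' pattern in moveAllFilesinDir is what limits the move to rotated files: a name needs at least two dots to match. A quick illustration with hypothetical file names in a throwaway directory:

```python
import glob
import os
import tempfile

d = tempfile.mkdtemp()
for name in ("poller.log", "poller.log.2011-03-04", "notes.txt"):
    open(os.path.join(d, name), "w").close()

# only names with two or more dots ("name.log.date") match '*.*.*'
matched = sorted(os.path.basename(p) for p in glob.glob(d + "/*.*.*"))
```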

Update: here is the zip routine

def createZipDir(path):
    # delete the old zip file if it exists, but leave it in place if no other files exist
    if len(os.listdir(path)) > 1:
        zipFile = zip_file_path
        if os.path.isfile(zipFile):
            os.remove(zipFile)
        with zipfile.ZipFile(zip_file_path, 'w', zipfile.ZIP_DEFLATED) as zipf:
            for root, dirs, files in os.walk(path):
                for file in files:
                    if file != zip_file_name:
                        zipf.write(os.path.join(root, file))
    else:
        debug_logger.debug("LoggingModule: - createZipDir - no files found, zip file left in place.")
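One caveat with createZipDir as written: zipf.write(os.path.join(root, file)) stores each file under its full relative path inside the archive. Passing an explicit arcname keeps the archive flat; a hedged sketch with throwaway paths:

```python
import os
import tempfile
import zipfile

d = tempfile.mkdtemp()
with open(os.path.join(d, "a.log"), "w") as f:
    f.write("sample log line\n")

archive = os.path.join(d, "Log.zip")
with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
    for root, dirs, files in os.walk(d):
        for name in files:
            if name != "Log.zip":
                # arcname stores just the file name, not the directory prefix
                zf.write(os.path.join(root, name), arcname=name)

with zipfile.ZipFile(archive) as zf:
    names = zf.namelist()
```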

Deleting the old files:

def deleteOldFilesinDir(srcDir):
    try:
        # Check that the path is a directory
        if os.path.isdir(srcDir):
            # Iterate over the files in the source directory
            for filePath in glob.glob(srcDir + '/*.*'):
                if filePath != zip_file_path:
                    os.remove(filePath)
        else:
            print("srcDir should be a directory")
    except Exception:
        error_logger.error("Error in LoggingModule - deleteOldFilesinDir", exc_info=True)

And here is the overall process:

I scheduled runArchiveProcess to run once a week.


def runArchiveProcess(allFiles = False):
    debug_logger.debug("LoggingModule: Archive process started.")
    moveAllFilesinDir(source_directory, archive_directory, allFiles)
    createZipDir(archive_directory)
    deleteOldFilesinDir(archive_directory)
    debug_logger.debug("LoggingModule Archive process completed.")

And the scheduler bit:

# only kicked off in its own thread...
def runScheduler():
    debug_logger.debug("LoggingModule - runScheduler - don't call this function outside of LoggingModule as it runs in its own thread.")
    schedule.every().monday.at("00:00:00").do(runArchiveProcess)
    #schedule.every(10).seconds.do(runArchiveProcess)  # for testing

    try:
        while True:
            debug_logger.debug("LoggingModule checking scheduler...")
            #Checks whether a scheduled task is pending to run or not
            schedule.run_pending()
            debug_logger.debug("LoggingModule Scheduler sleeping...")
            time.sleep(60 * 60) # checks every 1 hour
            #time.sleep(10)  # for testing
    except Exception as ex:
        error_logger.error("Error in LoggingModule - runScheduler", exc_info=True)


def runSchedulerThread():
    thread = threading.Thread(target=runScheduler)
    thread.start()
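
One detail worth noting: runScheduler loops forever, so the thread started by runSchedulerThread keeps the interpreter alive at exit unless it is marked as a daemon (or given a stop signal). A minimal sketch with a hypothetical stop event standing in for the schedule loop:

```python
import threading

def run_forever(stop_event):
    # stands in for runScheduler's "while True" loop
    while not stop_event.is_set():
        stop_event.wait(0.01)

stop = threading.Event()
# daemon=True lets the interpreter exit even if the loop is still running
worker = threading.Thread(target=run_forever, args=(stop,), daemon=True)
worker.start()
stop.set()
worker.join(timeout=1)
```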