Batch-deleting CloudWatch log groups with Boto3's delete_log_group

Time: 2017-06-05 17:28:44

Tags: python json amazon-web-services boto3 amazon-cloudwatch

I have a very long list of CloudWatch log groups that I need to delete... close to a hundred of them. Since you have to delete them one at a time, I figured a small Python script could help me out, but now I'm stuck.

Here is my script so far...

import boto3
from botocore.exceptions import ClientError
import json

#Connect to AWS using default AWS credentials in awscli config
cwlogs = boto3.client('logs')

loglist = cwlogs.describe_log_groups(
    logGroupNamePrefix='/aws/lambda/staging-east1-'
)

#writes json output to file...
with open('loglist.json', 'w') as outfile:
    json.dump(loglist, outfile, ensure_ascii=False, indent=4, sort_keys=True)

#Opens file and searches through to find given loggroup name
with open("loglist.json") as f:
    file_parsed = json.load(f)

for i in file_parsed['logGroups']:
    print(i['logGroupName'])


#   cwlogs.delete_log_group(
#       logGroupName='string'   <--- here is where I'm stuck
#   )

How do I get the value of 'logGroupName' out of each i, turn it into a string the delete_log_group call can use, and iterate so that every log group I need gone gets deleted? Or am I going about this the wrong way? I tried using json.loads and got the following error...

Traceback (most recent call last):
  File "CWLogCleaner.py", line 18, in <module>
    file_parsed = json.loads(f)
  File "/usr/local/Cellar/python/2.7.12/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/local/Cellar/python/2.7.12/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())

Or am I approaching this entirely the wrong way?

TIA

3 Answers:

Answer 0 (score: 2):

Unless you specifically need to save the JSON response to disk for some other purpose, you can simply use some variant of the following code:

import boto3

# --------------------------------------------------------------
# Delete all CloudWatch log streams.
# --------------------------------------------------------------
def delete_log_streams():
    logs = boto3.client('logs')
    # NOTE: describe_log_groups is itself paginated; this covers only the
    # first page (up to 50 log groups).
    log_groups = logs.describe_log_groups()

    for log_group in log_groups['logGroups']:
        log_group_name = log_group['logGroupName']
        print("Delete log group:", log_group_name)
        next_token = None  # reset pagination for each log group

        while True:
            if next_token:
                log_streams = logs.describe_log_streams(logGroupName=log_group_name,
                                                        nextToken=next_token)
            else:
                log_streams = logs.describe_log_streams(logGroupName=log_group_name)

            next_token = log_streams.get('nextToken', None)

            for stream in log_streams['logStreams']:
                log_stream_name = stream['logStreamName']
                print("Delete log stream:", log_stream_name)
                # delete_log_stream(log_group_name, log_stream_name, logs)

            if not next_token or len(log_streams['logStreams']) == 0:
                break
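The nextToken bookkeeping in the loop above is the generic token-pagination pattern; factored out on its own it looks like the sketch below. The `fetch` callable is an assumption for illustration: it stands in for an API call such as describe_log_streams.

```python
def paginate(fetch):
    """Follow nextToken pagination: call fetch(token) until no token remains."""
    token = None
    while True:
        page = fetch(token)
        yield page
        token = page.get('nextToken')
        if not token:
            break
```

With boto3 you would pass something like `lambda t: logs.describe_log_streams(logGroupName=name, **({'nextToken': t} if t else {}))` as `fetch` — though boto3's built-in `logs.get_paginator('describe_log_streams')` already does exactly this for you.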

Answer 1 (score: 1):

Here is what ended up working for me. I'm sure it's hacky and I'm not a developer, but it worked for me...

import json

import boto3

cwlogs = boto3.client('logs')

loglist = cwlogs.describe_log_groups(
    logGroupNamePrefix='ENTER NAME OF YOUR LOG GROUP HERE'
)

#writes json output to file...
with open('loglist.json', 'w') as outfile:
    json.dump(loglist, outfile, ensure_ascii=False, indent=4, sort_keys=True)

#Opens file and searches through to find given loggroup name
with open("loglist.json") as f:
    file_parsed = json.load(f)

for i in file_parsed['logGroups']:
    print(i['logGroupName'])

for i in file_parsed['logGroups']:
    cwlogs.delete_log_group(
        logGroupName=(i['logGroupName'])
    )
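The round-trip through a JSON file in the script above isn't actually needed: describe_log_groups already returns a parsed dict, so the names can be pulled straight out of the response. A minimal sketch — the helper name is mine, and the extraction is plain dict access, so it can be exercised on a canned response:

```python
def log_group_names(response):
    """Pull the log group names out of a describe_log_groups response dict."""
    return [group['logGroupName'] for group in response['logGroups']]

# With a real client this would be used roughly as:
#   cwlogs = boto3.client('logs')
#   response = cwlogs.describe_log_groups(logGroupNamePrefix='/aws/lambda/staging-east1-')
#   for name in log_group_names(response):
#       cwlogs.delete_log_group(logGroupName=name)
```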

Answer 2 (score: 1):

None of the solutions here worked the way I wanted (some because of pagination), so I built my own script. It deletes logs older than 7 days. You can change the timedelta, set it to 0, or drop the date condition entirely to delete all the logs.

from datetime import datetime, timedelta

import boto3

app_name = 'your function name here'


def login():
    client = boto3.client('logs')
    paginator = client.get_paginator('describe_log_streams')
    response_iterator = paginator.paginate(
        logGroupName=f'/aws/lambda/{app_name}',
    )
    return client, response_iterator


def deletion_date():
    # CloudWatch creationTime values are epoch milliseconds; pad the
    # epoch-seconds string out to 13 digits to get a millisecond cutoff.
    tod = datetime.today() - timedelta(days=7)
    epoch_date = str(int(tod.timestamp()))
    selected_date = int(epoch_date.ljust(13, '0'))
    return selected_date


def purger():
    n = 0
    print('Deleting log files..')
    for item in response:
        collection = item['logStreams']
        for collected_value in collection:
            if collected_value['creationTime'] < req_date:
                resp = client_.delete_log_stream(
                    logGroupName=f'/aws/lambda/{app_name}',
                    logStreamName=f"{collected_value['logStreamName']}"
                )
                n = n + 1
                if (resp['ResponseMetadata']['HTTPStatusCode']) == 200:
                    pass
                else:
                    print(f"Unable to purge logStream: {collected_value['logStreamName']}")
                    pass
    return n


if __name__ == '__main__':
    client_, response = login()
    req_date = deletion_date()
    print(f'\n{purger()} log streams were purged for the function {app_name}')
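The string-padding in deletion_date works because current epoch-second values are 10 digits, so left-justifying to 13 digits multiplies by 1000. Computing the millisecond cutoff arithmetically is equivalent and doesn't depend on the digit count; a small sketch (the function name and `now` parameter are mine, added so the cutoff can be checked against a fixed date):

```python
from datetime import datetime, timedelta

def cutoff_ms(days=7, now=None):
    """Epoch-millisecond timestamp `days` days before `now` (defaults to today)."""
    now = now or datetime.today()
    return int((now - timedelta(days=days)).timestamp() * 1000)
```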