Page-hit analysis in Python using a cleaned CSV log

Asked: 2018-12-18 13:37:20

Tags: python logging filtering logfile-analysis

Below is the code that analyses my cleaned CSV log. When I run it, I get this error:

    Traceback (most recent call last):
      File "page_hit_analysis.py", line 12, in <module>
        line = parser(line)

import apache_log_parser
from collections import Counter
from pandas import DataFrame
import seaborn


parser = apache_log_parser.make_parser('%h %l %u %t "%r" %>s')

pages = []
with open('cleaned_log7.csv') as in_f:
    for line in in_f:
        line = parser(line)
        pages.append(line['request_url'])

# Count hits per URL and keep the five most-requested pages.
counts = Counter(pages)

selected_pages = [pair[0] for pair in counts.most_common(5)]
print(selected_pages)

# Keep only the hits on the top pages and plot their counts.
graph_pages = [page for page in pages if page in selected_pages]
data = DataFrame({'pages': graph_pages})
print(data)

plot = seaborn.countplot(data=data, x='pages', order=selected_pages)
plot.get_figure().savefig('pages_plot7.png')

The code above works on the raw (uncleaned) log, but fails on the cleaned one.
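A likely cause is that the cleaning step changed the line layout (for example, commas replacing spaces, or the quotes around the request being stripped), so the lines no longer match the `'%h %l %u %t "%r" %>s'` pattern. A minimal sketch for locating the first line the parser rejects; the regex here is a hypothetical stand-in for the `apache_log_parser` parser, and `ValueError` stands in for the library's own parse exception:

```python
import re

# Hypothetical stand-in for apache_log_parser.make_parser('%h %l %u %t "%r" %>s'):
# host, ident, user, [timestamp], "request", status.
LINE_RE = re.compile(
    r'^(?P<host>\S+) (?P<ident>\S+) (?P<user>\S+) '
    r'\[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" (?P<status>\d{3})'
)

def parse(line):
    """Parse one Common Log Format line, raising ValueError on a mismatch."""
    m = LINE_RE.match(line)
    if m is None:
        raise ValueError('line does not match the log format: %r' % line)
    return m.groupdict()

def first_bad_line(lines):
    """Return (line_number, line) for the first line the parser rejects, or None."""
    for n, line in enumerate(lines, 1):
        try:
            parse(line)
        except ValueError:
            return n, line
    return None

good = '127.0.0.1 - - [18/Dec/2018:13:37:20 +0000] "GET /index.html HTTP/1.1" 200'
bad = '127.0.0.1,-,-,18/Dec/2018:13:37:20,GET /index.html HTTP/1.1,200'
print(first_bad_line([good, bad])[0])  # → 2
```

Running the same loop over `cleaned_log7.csv` (with the real parser call inside the `try`) would show exactly which line, and therefore which cleaning change, breaks the format.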

0 answers
