I have multiple log files for the same day. What I want to do is merge them into a single file, using a Perl script, based on the timestamps in the logs.
Log1.log
2014-06-02 21:54:38,805 INFO com.HomeManeger [Executor:Thread-19]: MyInfo started for myid=TEST-401406
2014-06-02 21:56:27,358 INFO com.HomeManeger [Executor:Thread-13]: HomeManeger: populateMyInfo completed for my id = TEST-401406,
2014-06-02 21:59:32,358 INFO com.HomeManeger [Executor:Thread-17]: MyInfo completed for myid=TEST-401405
Log2.log
2014-06-02 21:56:27,295 INFO com.homeManeger.MyCommand [Proxy:ProxyService:TcpWorker:2]: MyCommand::Processing reqest[AB:MyInfo] obj(Collection [ID={005A004A5B0F9}, ]
) client(POFFBObj [ID={XXXXXX-E8F5-11D5-YYY-0002B33D9D0C}, meta={}, fields=[XXX]]
)
2014-06-02 21:58:27,310 INFO com.HomeManeger.UpdateMyInfoTask
Merged log
2014-06-02 21:54:38,805 INFO com.HomeManeger [Executor:Thread-19]: MyInfo started for myid=TEST-401406
2014-06-02 21:56:27,295 INFO com.homeManeger.MyCommand [Proxy:ProxyService:TcpWorker:2]: MyCommand::Processing reqest[AB:MyInfo] obj(Collection[ID={005A004A5B0F9}, ]
) client(POFFBObj [ID={XXXXXX-E8F5-11D5-YYY-0002B33D9D0C}, meta={}, fields=[XXX]]
)
2014-06-02 21:56:27,358 INFO com.HomeManeger [Executor:Thread-13]: HomeManeger: populateMyInfo completed for my id = TEST-401406,
2014-06-02 21:56:32,358 INFO com.HomeManeger [Executor:Thread-17]: MyInfo completed for myid=TEST-401405
2014-06-02 21:58:27,310 INFO com.HomeManeger.UpdateMyInfoTask
I'm very new to Perl, so any help would be appreciated.
Answer 0 (score: 3)
The following script handles any number of log files and writes the merged log to a file via redirection. It loads everything into memory, so size is a factor. Entries are sorted as plain strings, which works because each one starts with a zero-padded timestamp; lines without a timestamp are treated as continuations of the previous entry:
#!/usr/bin/perl
use warnings;
use strict;

die "Usage: perl $0 log1 log2 > merged.log\n" if !@ARGV;

my @lines;
while (<>) {
    if (/^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}/) {
        # New log entry: starts with a timestamp
        push @lines, $_;
    } else {
        # Continuation line: append it to the previous entry
        $lines[-1] .= $_;
    }
}

print sort @lines;
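Assuming the script is saved as merge_logs.pl and the two sample logs as Log1.log and Log2.log (names chosen purely for illustration), a run would look like:

perl merge_logs.pl Log1.log Log2.log > merged.log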
If memory is a factor, you'll need to do a proper merge sort. The following is adapted from a perl merge sort answer from this March:
use strict;
use warnings;
use autodie;

die "Usage: perl $0 log1 log2 > merged.log\n" if !@ARGV;

# Open a file handle for each log file
my @fhs = map {open my $fh, '<', $_; $fh} @ARGV;

# Read the first line of each file
my @data = map {scalar <$_>} @fhs;

# Loop while at least one file still has a pending entry
while (@data) {
    # Pull out the entry with the earliest timestamp
    my $index = (sort {$data[$a] cmp $data[$b]} (0 .. $#data))[0];
    print $data[$index];

    # Advance that file to its next timestamped entry,
    # printing any continuation lines along the way
    while (defined($data[$index] = readline $fhs[$index])) {
        last if $data[$index] =~ /^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}/;
        print $data[$index];
    }

    # End of that file: drop its handle and its slot
    if (!defined $data[$index]) {
        splice @fhs,  $index, 1;
        splice @data, $index, 1;
    }
}
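A side note on the selection step: sorting all of the indexes just to find the earliest entry is O(k log k) per entry for k open files. That is irrelevant for a handful of logs, but the same choice can be made in a single pass with List::Util — a sketch of an alternative, not part of the original answer:

use List::Util qw(reduce);

# Index of the lexicographically smallest (earliest) pending entry
my $index = reduce { $data[$a] lt $data[$b] ? $a : $b } 0 .. $#data;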
Answer 1 (score: 1)
Why not just:
cat file1 file2 | sort > out
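If you'd rather stay in Perl, a rough one-liner equivalent is shown below (a sketch; like the pipeline above it sorts individual lines, so the multi-line entry in Log2.log would be split apart unless every line starts with a timestamp):

perl -e 'print sort <>' file1 file2 > out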
Answer 2 (score: 0)
You could try:
#!/usr/bin/perl
use warnings;
use strict;
use Time::Piece;
use autodie;

my $fmt   = '%Y-%m-%d %H:%M:%S';
my @files = qw(Log-1.log Log-2.log);
my @lines;

for my $file (@files) {
    open(my $fh, "<", $file);
    while (my $line = <$fh>) {
        # Grab the timestamp up to the comma (assumes every line starts with one)
        my ($data) = $line =~ /^(\S+\s+\S+?),/;
        my $k = Time::Piece->strptime($data, $fmt);
        push(@lines, {key => $k, line => $line});
    }
    close($fh);
}

# Time::Piece objects compare numerically as epoch seconds
for (sort {$a->{key} <=> $b->{key}} @lines) {
    print $_->{line};
}
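Note that the sample logs contain continuation lines that don't start with a timestamp; as written, the regex fails to match on those and strptime is handed an undefined value. A minimal guard (my addition, not part of the original answer) would attach such lines to the previous entry instead:

# Inside the while loop, before calling strptime:
unless ($line =~ /^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}/) {
    $lines[-1]{line} .= $line if @lines;   # continuation line: append to previous entry
    next;
}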
Answer 3 (score: 0)
You could use Advanced Logs Merge, but it only works on Windows: http://www.softpedia.com/get/Others/Miscellaneous/Advanced-Logs-Merge.shtml