I am using Logstash 2.4.0 and I want to use Logstash to send slowlogs to a .csv output file. My config file is this:
input {
  file {
    path => "D:\logstash-2.4.0\logstash-2.4.0\bin\rachu.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => [ "message",
      "\[%{TIMESTAMP_ISO8601:TIMESTAMP}\]\[%{LOGLEVEL:LEVEL}%{SPACE}\]\[%{DATA:QUERY}\]%{SPACE}\[%{DATA:QUERY1}\]%{SPACE}\[%{DATA:INDEX-NAME}\]\[%{DATA:SHARD}\]%{SPACE}took\[%{DATA:TOOK}\],%{SPACE}took_millis\[%{DATA:TOOKM}\], types\[%{DATA:types}\], stats\[%{DATA:stats}\],search_type\[%{DATA:search_type}\], total_shards\[%{NUMBER:total_shards}\], source\[%{DATA:source_query}\], extra_source\[%{DATA:extra_source}\],"]
  }
}
output {
  csv {
    fields => ["TIMESTAMP","LEVEL","QUERY","QUERY1","INDEX-NAME","SHARD","TOOK","TOOKM","types","stats","search_type","total_shards","source_query","extra_source"]
    path => "D:\logstash-2.4.0\logstash-2.4.0\bin\logoutput.csv"
    spreadsheet_safe => false
  }
}
Answer 0 (score: 2)

The csv filter is useless in your context. Its goal is to parse incoming CSV data, which is not what you have. What you need is to first parse the log line with a grok filter; only then can it be sent correctly to the csv output:
filter {
  grok {
    match => {"message" => "\[%{TIMESTAMP_ISO8601:TIMESTAMP}\]\[%{LOGLEVEL:LOGLEVEL} \]\[%{DATA:QUERY}\] \[%{WORD:QUERY1}\] \[%{WORD:INDEX}\]\[%{INT:SHARD}\] took\[%{BASE10NUM:TOOK}ms\], took_millis\[%{BASE10NUM:took_millis}\], types\[%{DATA:types}\], stats\[%{DATA:stats}\], search_type\[%{DATA:search_type}\], total_shards\[%{INT:total_shards}\], source\[%{DATA:source}\], extra_source\[%{DATA:extra_source}\]"}
  }
}
output {
  csv {
    fields => ["TIMESTAMP","LOGLEVEL","QUERY","QUERY1","INDEX","SHARD","TOOK","took_millis","types","stats","search_type","total_shards","source","extra_source"]
    path => "F:\logstash-5.1.1\logstash-5.1.1\finaloutput1"
    spreadsheet_safe => false
  }
}

Note that the names in the csv output's fields list must match the capture names in the grok pattern exactly (e.g. INDEX and source, not INDEX-NAME and source_query), otherwise those columns will be empty.
Note: because of this open issue, this does not work on Logstash 5.1.1 yet. It should be fixed soon, but in the meantime the above works on Logstash 2.4.
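To sanity-check the grok pattern outside of Logstash, you can test a roughly equivalent regular expression in Python against a slowlog line and emit the same field order as the csv output. This is only a sketch: the sample line below is a hypothetical Elasticsearch search-slowlog entry, not taken from the original post, and the hand-written regex only approximates the grok patterns (DATA, WORD, INT, BASE10NUM).

```python
import csv
import io
import re

# Rough regex equivalent of the grok pattern in the answer (an approximation,
# not the actual grok library definitions).
PATTERN = re.compile(
    r"\[(?P<TIMESTAMP>[^\]]+)\]\[(?P<LOGLEVEL>\w+) ?\]\[(?P<QUERY>[^\]]+)\] "
    r"\[(?P<QUERY1>\w+)\] \[(?P<INDEX>\w+)\]\[(?P<SHARD>\d+)\] "
    r"took\[(?P<TOOK>[\d.]+)ms\], took_millis\[(?P<took_millis>\d+)\], "
    r"types\[(?P<types>[^\]]*)\], stats\[(?P<stats>[^\]]*)\], "
    r"search_type\[(?P<search_type>[^\]]*)\], total_shards\[(?P<total_shards>\d+)\], "
    r"source\[(?P<source>.*)\], extra_source\[(?P<extra_source>.*)\]"
)

# Hypothetical sample slowlog line (placeholder node/index names).
sample = (
    '[2017-01-14T10:46:19,495][TRACE][index.search.slowlog.query] '
    '[node1] [myindex][0] took[5.2ms], took_millis[5], types[], stats[], '
    'search_type[QUERY_THEN_FETCH], total_shards[5], '
    'source[{"query":{"match_all":{}}}], extra_source[]'
)

m = PATTERN.match(sample)

# Same column order as the csv output's fields list.
fields = ["TIMESTAMP", "LOGLEVEL", "QUERY", "QUERY1", "INDEX", "SHARD",
          "TOOK", "took_millis", "types", "stats", "search_type",
          "total_shards", "source", "extra_source"]
out = io.StringIO()
csv.writer(out).writerow(m.group(f) for f in fields)
print(out.getvalue().strip())
```

If the regex fails to match (`m` is `None`), the pattern needs adjusting to your actual slowlog format, which is exactly the kind of mismatch that produces `_grokparsefailure` tags in Logstash.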