Logstash noob here; I'm trying to parse log lines like this one through Logstash:
2015-03-31 02:53:39 INFO This is info message 5
The config file I'm using is:
input {
  file {
    path => "/sample/log4j_log.log"
    start_position => beginning
  }
}
filter {
  grok {
    match => [ "message" , "%{DATESTAMP:logtimestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" ]
  }
  date {
    locale => "en"
    match => [ "logtimestamp" , "yyyy-MM-dd HH:mm:ss" ]
  }
}
output {
  #elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}
The output I get is:
"message" => "2015-03-31 02:53:39 INFO This is info message 5",
"@version" => "1",
"@timestamp" => "0015-03-30T21:00:11.000Z",
"host" => "abc",
"path" => "/sample/log4j_log.log",
"logtimestamp" => "15-03-31 02:53:39",
"level" => "INFO",
"msg" => " This is info message 5"
I see that the logtimestamp field comes out in "YY-MM-dd HH:mm:ss" format, and I don't know why it gets converted to that format. I even tried that format in the date filter; in those cases I got this output:
{
"message" => "2015-03-31 02:53:39 INFO This is info message 5",
"@version" => "1",
"@timestamp" => "2015-04-07T17:55:51.231Z",
"host" => "abc",
"path" => "/sample/log4j_log.log",
"logtimestamp" => "15-03-31 02:53:39",
"level" => "INFO",
"msg" => " This is info message 5"
}
In all of these cases, @timestamp does not match the actual log event timestamp, which causes problems with the Elasticsearch + Kibana visualizations.
I tried including target => "@timestamp" and locale => "en" as suggested in other questions on Stack Overflow, with no success.
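For reference, the date filter variant I attempted looked roughly like this (a sketch; logtimestamp is the field captured by my grok pattern above):

date {
  locale => "en"
  match => [ "logtimestamp" , "yyyy-MM-dd HH:mm:ss" ]
  target => "@timestamp"
}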
The only thing I don't seem to have tried is this: Logstash date parsing as timestamp using the date filter. I don't believe it applies exactly to my log events.
Answer 0 (score: 1)
Your grok pattern is incorrect.
Change it to use TIMESTAMP_ISO8601 instead of DATESTAMP. DATESTAMP is built on US/EU-style date patterns, so on your line it only captures "15-03-31" and drops the century, which is why logtimestamp shows a two-digit year:
grok {
  match => [ "message" , "%{TIMESTAMP_ISO8601:logtimestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" ]
}
This is the output:
{
"message" => "2015-03-31 02:53:39 INFO This is info message 5",
"@version" => "1",
"@timestamp" => "2015-03-30T18:53:39.000Z",
"host" => "BEN_LIM",
"logtimestamp" => "2015-03-31 02:53:39",
"level" => "INFO",
"msg" => " This is info message 5"
}
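Putting it together, here is a minimal sketch of the full filter block with the corrected grok pattern plus your original date filter (same field names as above). With the four-digit year captured, the "yyyy-MM-dd HH:mm:ss" pattern should now match and set @timestamp to the event time:

filter {
  grok {
    match => [ "message" , "%{TIMESTAMP_ISO8601:logtimestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" ]
  }
  date {
    locale => "en"
    match => [ "logtimestamp" , "yyyy-MM-dd HH:mm:ss" ]
    target => "@timestamp"
  }
}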