I'm trying to parse logs coming from an rsyslog server and insert them into Elasticsearch. My incoming log lines look like this:
Feb 13 01:17:11 xxxx xxx-xxxx_error 2016/02/13 01:17:02 [error] 13689#0: *1956118 open() "xxxxxx" failed (2: No such file or directory), client: xx.xx.xx.xx, server: xxxxx.xx, request: "xxxxxxx HTTP/1.1", host: "xxxxx.xx"
I'm extracting fields with the following Logstash filter:
grok {
  match => {
    "message" => [
      "(?<logstamp>\h{3} \d{2} \d{2}:\d{2}:\d{2}) %{WORD:hostname} (?<source>[^\s]+) (?<timestamp>\d{4}/\d{2}/\d{2} \d{2}:\d{2}:\d{2}) %{GREEDYDATA:error_message}"
    ]
  }
}
date {
  locale => "en"
  match => [ "timestamp", "yyyy/MM/dd HH:mm:ss" ]
}
mutate {
  remove_field => [ "@version", "_score", "message", "host", "_type", "logstamp" ]
}
According to http://grokdebug.herokuapp.com/, my pattern is sane. Each log line carries two dates: the first is when rsyslog received the line, the second comes from nginx. I want the second one to end up in "timestamp".
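For a quick sanity check outside grokdebug, the pattern can be approximated as a plain Ruby named-group regex (an illustration only: %{WORD} is approximated with \S+ and the month abbreviation with \w{3}, which is looser than the grok original):

```ruby
line = 'Feb 13 01:17:11 xxxx xxx-xxxx_error 2016/02/13 01:17:02 [error] 13689#0: *1956118 open() failed'

# Rough re-creation of the grok pattern with Ruby named captures.
pat = /^(?<logstamp>\w{3} \d{2} \d{2}:\d{2}:\d{2}) (?<hostname>\S+) (?<source>\S+) (?<timestamp>\d{4}\/\d{2}\/\d{2} \d{2}:\d{2}:\d{2}) (?<error_message>.*)$/

m = pat.match(line)
puts m[:logstamp]   # => "Feb 13 01:17:11"
puts m[:source]     # => "xxx-xxxx_error"
puts m[:timestamp]  # => "2016/02/13 01:17:02"
```

The captures split the line exactly as intended: the rsyslog stamp lands in logstamp and the nginx stamp in timestamp.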
The error I get from Logstash is:
@metadata_accessors=#<LogStash::Util::Accessors:0x1d630482 @store={"path"=>"..."}, @lut={"[path]"=>[{"path"=>"..."},
"path"]}>, @cancelled=false>], :response=>{"create"=>{"_index"=>"...", "_type"=>"...", "_id"=>"...", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception",
"reason"=>"failed to parse [timestamp]", "caused_by"=>{"type"=>"illegal_argument_exception",
"reason"=>"Invalid format: \"2016/02/16 12:25:16\" is malformed at \"/02/16 12:25:16\""}}}}, :level=>:warn}
(I trimmed the output for brevity.)
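The rejection makes sense once the value is compared against what an ISO-8601-based format like strict_date_optional_time accepts. A minimal Ruby sketch of the mismatch (using the stdlib time library, not Elasticsearch itself):

```ruby
require 'time'

# strict_date_optional_time is ISO-8601-based, e.g. "2016-02-16T12:25:16".
# The slash-separated nginx value is not ISO-8601, so strict parsing rejects it:
begin
  Time.iso8601("2016/02/16 12:25:16")
  puts "parsed"
rescue ArgumentError
  puts "rejected"  # this branch runs
end

# The same instant written in ISO-8601 form parses fine:
t = Time.iso8601("2016-02-16T12:25:16")
puts t.strftime("%Y-%m-%d %H:%M:%S")  # => "2016-02-16 12:25:16"
```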
Edit: working config
I ended up converting the timestamp from the nginx log into a more standard format (as shown in the ruby block) and using it as @timestamp via the date match.
grok {
  match => {
    "message" => [
      "(?<logstamp>\h{3} \d{2} \d{2}:\d{2}:\d{2}) %{WORD:hostname} (?<source>[^\s]+) (?<ngxstamp>[^\s]+ [^\s]+) %{GREEDYDATA:error_message}"
    ]
  }
}
ruby {
  code => "event['ngxstamp'] = event.timestamp.time.localtime.strftime('%Y-%m-%d %H:%M:%S')"
}
date {
  match => [ "ngxstamp", "yyyy-MM-dd HH:mm:ss" ]
  locale => "en"
}
mutate {
  remove_field => [ "@version", "_score", "message", "host", "_type", "logstamp" ]
}
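The reformatting idea behind that ruby block can be sketched standalone: take the slash-separated nginx stamp and re-emit it in the yyyy-MM-dd HH:mm:ss form that the date filter matches (a sketch with a hard-coded sample value, not the Logstash event API):

```ruby
require 'time'

# Sample value as captured from the nginx error log.
raw = "2016/02/13 01:17:02"

# Parse the slash-separated format, then re-emit it dash-separated so the
# "yyyy-MM-dd HH:mm:ss" date match can pick it up.
t   = Time.strptime(raw, '%Y/%m/%d %H:%M:%S')
std = t.strftime('%Y-%m-%d %H:%M:%S')
puts std  # => "2016-02-13 01:17:02"
```

Note that the actual filter above formats event.timestamp rather than re-parsing the grokked field; this sketch only shows the string conversion itself.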
Answer 0 (score: 3)
Since the timestamp field is mapped as strict_date_optional_time, the date pattern you use in your date filter should be yyyy-MM-dd HH:mm:ss instead of yyyy/mm/dd HH:mm:ss. So: MM instead of mm for the month. The missing T between the date and time parts may still be a problem, since strict_date_optional_time mandates it.