My log file contains multiple patterns, including lines whose payload is JSON. I want to parse the different patterns with the grok plugin, but it does not seem to work:
'filter {
  grok {
    break_on_match => false
    match => [
      "message", "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadID} - %{GREEDYDATA:Line}",
      "message", "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadID} - %{IP:Clicnet} - - %{GREEDYDATA:Line}"
    ]
  }
  json { source => "Line" }
  mutate { remove_field => [ "Line", "ThreadID" ] }
}'
Even for a line whose JSON string is parsed successfully, the event gets a _grokparsefailure tag. Input line:
2017-01-27 11:54:48 INFO PropertiesReader:33 - {"timestamp":1485518878968,"h":"297268184dde","l":"INFO","cN":"org.com.logstash.demo","mN":"loadProperties","m":"load property file from /var/tmp/conf"}
{
"message" => "2017-01-27 11:54:48 INFO PropertiesReader:33 - {\"timestamp\":1485518878968,\"h\":\"297268184dde\", \"l\":\"INFO\", \"cN\":\"org.com.logstash.demo\", \"mN\":\"loadProperties\", \"m\":\"load property file from /var/tmp/conf\"}",
"@version" => "1",
"@timestamp" => "2017-03-20T17:19:16.316Z",
"type" => "stdin",
"host" => "ef3b82",
"LogDate" => "2017-01-27 11:54:48",
"loglevel" => "INFO",
"threadName" => "PropertiesReader",
"tags" => [
[0] "_grokparsefailure"
],
"timestamp" => 1485518878968,
"h" => "297268184dde",
"l" => "INFO",
"cN" => "org.com.logstash.demo",
"mN" => "loadProperties",
"m" => "load property file from /var/tmp/conf"
}
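This appears to be a known quirk of the grok filter in Logstash versions of that era: with break_on_match => false, every pattern in the list is tried against the event, and a pattern that fails to match can add _grokparsefailure even though another pattern succeeded. A sketch of a workaround (untested, field names taken from the question): keep the default break_on_match => true and list the more specific, IP-bearing pattern first so the filter stops at the first successful match.

```
filter {
  grok {
    # break_on_match defaults to true: grok stops at the first pattern
    # that matches, so put the more specific pattern first.
    match => {
      "message" => [
        "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadID} - %{IP:Clicnet} - - %{GREEDYDATA:Line}",
        "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadID} - %{GREEDYDATA:Line}"
      ]
    }
  }
}
```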
And the second line, which contains no JSON, fails completely:
2017-01-20 15:46:16 INFO RequestLog:60 - 10.252.134.34 - - [20/Jan/2017:15:46:16 +0000] "OPTIONS //127.0.0.0:8080/ HTTP/1.1" 404 237 1
Error parsing json {:source=>"Line", :raw=>["10.252.134.34 - - [20/Jan/2017:15:46:16 +0000] \"OPTIONS //127.0.0.0:8080/ HTTP/1.1\" 404 237 1", "[20/Jan/2017:15:46:16 +0000] \"OPTIONS //127.0.0.0:8080/ HTTP/1.1\" 404 237 1"], :exception=>java.lang.ClassCastException: org.jruby.RubyArray cannot be cast to org.jruby.RubyIO, :level=>:warn}
{
"message" => "2017-01-20 15:46:16 INFO RequestLog:60 - 10.252.134.34 - - [20/Jan/2017:15:46:16 +0000] \"OPTIONS //127.0.0.0:8080/ HTTP/1.1\" 404 237 1",
"@version" => "1",
"@timestamp" => "2017-03-20T17:19:51.175Z",
"type" => "stdin",
"host" => "ef3b82",
"LogDate" => [
[0] "2017-01-20 15:46:16",
[1] "2017-01-20 15:46:16"
],
"loglevel" => [
[0] "INFO",
[1] "INFO"
],
"threadName" => [
[0] " RequestLog",
[1] " RequestLog"
],
"Clicnet" => "10.252.134.34",
"tags" => [
[0] "_jsonparsefailure"
]
}
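The array-valued fields (LogDate, loglevel, threadName) and the json error's array :raw value both come from the same cause: with break_on_match => false, both grok patterns match this line, and each successful match appends its captures, so Line reaches the json filter as an array of two strings rather than one string. A rough Python sketch of that behavior (the regexes are simplified stand-ins for the grok patterns, not their actual expansions):

```python
import re

# Simplified stand-ins for the two grok patterns from the question.
generic = re.compile(
    r"^(?P<LogDate>\S+ \S+) (?P<loglevel>\w+)\s+"
    r"(?P<threadName>[^:]+):(?P<ThreadID>\d+) - (?P<Line>.*)$")
with_ip = re.compile(
    r"^(?P<LogDate>\S+ \S+) (?P<loglevel>\w+)\s+"
    r"(?P<threadName>[^:]+):(?P<ThreadID>\d+) - "
    r"(?P<Clicnet>\d+\.\d+\.\d+\.\d+) - - (?P<Line>.*)$")

msg = ('2017-01-20 15:46:16 INFO RequestLog:60 - 10.252.134.34 - - '
       '[20/Jan/2017:15:46:16 +0000] "OPTIONS //127.0.0.0:8080/ HTTP/1.1" 404 237 1')

event = {}
for pattern in (generic, with_ip):   # break_on_match => false: try every pattern
    m = pattern.match(msg)
    if m:
        for field, value in m.groupdict().items():
            if field in event:
                # A second matching pattern turns the field into an array,
                # which is why "Line" reaches the json filter as a list.
                existing = event[field]
                event[field] = (existing if isinstance(existing, list)
                                else [existing]) + [value]
            else:
                event[field] = value

# Both patterns match, so Line ends up as a two-element list and the
# json filter cannot parse it.
```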
Answer 0 (score: 1)
After spending five hours on this, I managed to find a solution. The configuration below successfully parses both log lines:
/opt/logstash/bin/logstash -e 'filter {
  grok {
    match => {
      "message" => [
        "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadName} - %{IP:Client} - - %{GREEDYDATA:LogMessage}",
        "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadID} - %{GREEDYDATA:Line}"
      ]
    }
  }
  json { source => "Line" }
  mutate { remove_field => [ "Line", "ThreadID" ] }
}'
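One caveat: this config still runs the json filter on every event, so lines whose Line capture is not JSON will be tagged _jsonparsefailure. A possible refinement (an untested sketch, using a Logstash conditional) is to parse Line only when it looks like a JSON object:

```
filter {
  grok {
    match => {
      "message" => [
        "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadID} - %{IP:Client} - - %{GREEDYDATA:LogMessage}",
        "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadID} - %{GREEDYDATA:Line}"
      ]
    }
  }
  # Only run the json filter when the captured field starts with '{'.
  if [Line] =~ /^\{/ {
    json { source => "Line" }
  }
  mutate { remove_field => [ "Line", "ThreadID" ] }
}
```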