Grok filter for my log pattern

Date: 2017-03-28 14:08:18

Tags: logging logstash elastic-stack grok

I am experimenting with the ELK stack. I am trying to feed logs with the following pattern into Logstash:

14:25:43.324 [http-nio-9090-exec-116] INFO  com.app.MainApp - Request has been detected

I tried the following grok patterns as filters in logstash.conf:

match => { "message" => [ " (?<timestamp>%{HOUR}:%{MINUTE}:%{SECOND}) \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} %{DATA:Class}\- %{GREEDYDATA:message}" ]}

match => { "message" => [ " %{TIME:timestamp} \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} %{DATA:Class}\- %{GREEDYDATA:message}" ]}

But when I feed the log into Logstash, I get the following error:

   [0] "_grokparsefailure"

Can anyone suggest the correct grok filter for the log pattern above?

1 Answer:

Answer 0 (score: 0):

The parse failure was fixed by removing the leading space from the pattern. The working logstash.conf after removing that space looks like this:
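The effect of that stray leading space can be reproduced with a plain regex. This is only a sketch: the character classes below merely approximate the real `TIME` and `NOTSPACE` grok patterns, but they show why the space makes the match fail against a line that starts with the timestamp.

```python
import re

line = ("14:25:43.324 [http-nio-9090-exec-116] INFO  "
        "com.app.MainApp - Request has been detected")

# Rough stand-in for %{TIME}: HH:MM:SS.mmm
time_re = r"\d{2}:\d{2}:\d{2}\.\d+"

# Pattern as written in the question: note the leading space.
with_space = re.search(rf" (?P<timestamp>{time_re}) \[(?P<thread>\S+)\]", line)

# Same pattern with the leading space removed.
without_space = re.search(rf"(?P<timestamp>{time_re}) \[(?P<thread>\S+)\]", line)

print(with_space)                      # None: no space precedes the timestamp
print(without_space.group("thread"))   # http-nio-9090-exec-116
```

Because the log line begins with the timestamp, a pattern that demands a space *before* it can never match, and grok reports `_grokparsefailure`.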

input {
  file {
    path => ["./debug.log"]
    codec => multiline {
      # Grok pattern names are valid! :)
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => previous
    }
  }
}

filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{NOTSPACE:uid}\] \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} %{DATA:class}\-%{GREEDYDATA:message}" ]
    overwrite => [ "message" ]
  }
  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss" ]
  }
}


output {
  elasticsearch { hosts => ["localhost"] }
  stdout { codec => rubydebug }
}