Problem parsing Check Point firewall logs with a grok filter

Time: 2015-04-30 16:12:25

Tags: elasticsearch logstash grok

These are Check Point firewall logs, and they look like the following (first row = fields, second row and all rows thereafter = values of the respective fields):

 "Number" "Date" "Time" "Interface" "Origin" "Type" "Action" "Service" "Source Port" "Source" "Destination" "Protocol" "Rule" "Rule Name" "Current Rule Number" "Information" 
"7319452" "18Mar2015" "15:00:00" "eth1-04" "grog1" "Log" "Accept" "domain-udp" "20616" "172.16.36.250" "8.8.8.8" "udp" "7" "" "7-open_1" "inzone: Internal; outzone: External; service_id: domain-udp" "Security Gateway/Management"

I tried to do this with some code (a grok filter) I found online. I have a test file that contains only "GoLpoT" "502" (including the quotes), and some code that reads this file, pasted below:

input {
  file {
    path => "/usr/local/bin/firewall_log"
  }
}

filter {
  grok {
    match => ["message", "%{WORD:type}\|%{NUMBER:nums}"]
  }
}

output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}

When I run the code, I get the following error:

"message" => "",
      "@version" => "1",
    "@timestamp" => "2015-04-30T15:52:48.331Z",
          "host" => "UOD-220076",
          "path" => "/usr/local/bin/firewall_log",
          "tags" => [
        [0] "_grokparsefailure"

Please help.

My second question - how do I parse the Date and Time, together or separately? The date doesn't change - these are all logs from a single day - only the time changes.
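
For the second part, a rough sketch, assuming the grok pattern has already captured the two columns into fields named date and time (illustrative names): the two values can be joined with mutate and handed to the date filter, whose ddMMMyyyy HH:mm:ss format matches values like 18Mar2015 15:00:00:

filter {
  # join the hypothetical date and time fields into one string
  mutate {
    add_field => { "timestamp" => "%{date} %{time}" }
  }
  # parse "18Mar2015 15:00:00" and use it as @timestamp
  date {
    match => [ "timestamp", "ddMMMyyyy HH:mm:ss" ]
  }
}

Because the date stays the same within one day's file, the same format string applies to every line.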

Many thanks.

0 Answers:

No answers.