Grok filter fails for ISO8601 timestamps since 5.2

Time: 2017-02-05 15:13:05

Tags: logstash elastic-stack logstash-grok

Since I upgraded our ELK stack from 5.0.2 to 5.2, our grok filter has been failing, and I don't know why. Maybe I overlooked something in the changelog?

Filter

filter {
  if [type] == "nginx_access" {
    grok {
      match => { "message" => "%{IPORHOST:remote_addr} - %{USERNAME:remote_user} \[%{TIMESTAMP_ISO8601:timestamp}\] \"%{WORD:method} %{URIPATHPARAM:request} HTTP/%{NUMBER:httpversion}\" %{INT:status} %{INT:body_bytes_sent} %{QS:http_referer} %{QS:http_user_agent} \"%{DATA:host_uri}\" \"%{DATA:proxy}\" \"%{DATA:upstream_addr}\" \"%{WORD:cache_status}\" \[%{NUMBER:request_time}\] \[(?:%{NUMBER:proxy_response_time}|-)\]" }
      add_field => [ "received_at", "%{@timestamp}" ]
    }
    mutate {
      convert => {
        "proxy_response_time" => "float"
        "request_time" => "float"
        "body_bytes_sent" => "integer"
      }
    }
  }
}
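
For reference, the pattern can be tested in isolation with a minimal stdin/stdout pipeline (a sketch; paste one raw access-log line into the console and the extracted fields are printed back):

input {
  stdin { type => "nginx_access" }   # paste a raw nginx access-log line here
}
filter {
  if [type] == "nginx_access" {
    grok {
      match => { "message" => "%{IPORHOST:remote_addr} - %{USERNAME:remote_user} \[%{TIMESTAMP_ISO8601:timestamp}\] \"%{WORD:method} %{URIPATHPARAM:request} HTTP/%{NUMBER:httpversion}\" %{INT:status} %{INT:body_bytes_sent} %{QS:http_referer} %{QS:http_user_agent} \"%{DATA:host_uri}\" \"%{DATA:proxy}\" \"%{DATA:upstream_addr}\" \"%{WORD:cache_status}\" \[%{NUMBER:request_time}\] \[(?:%{NUMBER:proxy_response_time}|-)\]" }
    }
  }
}
output {
  stdout { codec => rubydebug }      # prints all extracted fields, including _grokparsefailure tags
}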

Error

Invalid format: \"2017-02-05T15:55:38+01:00\" is malformed at \"-02-05T15:55:38+01:00\"

Full error

[2017-02-05T15:55:49,500][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2017.02.05", :_type=>"nginx_access", :_routing=>nil}, 2017-02-05T14:55:38.000Z proxy2 4.3.2.1 - - [2017-02-05T15:55:38+01:00] "HEAD / HTTP/1.1" 200 0 "-" "Zabbix" "example.com" "host1:10040" "1.2.3.4:10040" "MISS" [0.095] [0.095]], :response=>{"index"=>{"_index"=>"filebeat-2017.02.05", "_type"=>"nginx_access", "_id"=>"AVoOxh7p5p68dsalXDFX", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [timestamp]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2017-02-05T15:55:38+01:00\" is malformed at \"-02-05T15:55:38+01:00\""}}}}}
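
Note that the 400 comes from the Elasticsearch mapper ("failed to parse [timestamp]"), not from grok itself, so whatever format was set (or dynamically guessed) when the timestamp field was first indexed is what the mapper now enforces. One way to check is to inspect the index mapping (assuming Elasticsearch listens on localhost:9200):

curl -XGET 'http://localhost:9200/filebeat-2017.02.05/_mapping/nginx_access?pretty'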

The whole thing works perfectly on http://grokconstructor.appspot.com, and TIMESTAMP_ISO8601 still seems to be the right choice (https://github.com/logstash-plugins/logstash-patterns-core/blob/master/patterns/grok-patterns).

Tech stack

  • Ubuntu 16.04
  • Elasticsearch 5.2.0
  • Logstash 5.2.0
  • Filebeat 5.2.0
  • Kibana 5.2.0

Any ideas?

Cheers, Fin

Update

So, for some reason, this version works:

filter {
  if [type] == "nginx_access" {
    grok {
      match => { "message" => "%{IPORHOST:remote_addr} - %{USERNAME:remote_user} \[%{TIMESTAMP_ISO8601:timestamp}\] \"%{WORD:method} %{URIPATHPARAM:request} HTTP/%{NUMBER:httpversion}\" %{INT:status} %{INT:body_bytes_sent} %{QS:http_referer} %{QS:http_user_agent} \"%{DATA:host_uri}\" \"%{DATA:proxy}\" \"%{DATA:upstream_addr}\" \"%{WORD:cache_status}\" \[%{NUMBER:request_time}\] \[(?:%{NUMBER:proxy_response_time}|-)\]" }
      add_field => [ "received_at", "%{@timestamp}" ]
    }
    date {
        match => [ "timestamp" , "yyyy-MM-dd'T'HH:mm:ssZ" ]
        target => "timestamp"
    }
    mutate {
      convert => {
        "proxy_response_time" => "float"
        "request_time" => "float"
        "body_bytes_sent" => "integer"
      }
    }
  }
}
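
As an aside, the date filter also accepts the special ISO8601 value for match, which should cover any valid ISO8601 timestamp without hard-coding a pattern (a sketch):

date {
    match => [ "timestamp", "ISO8601" ]   # built-in keyword matching any valid ISO8601 date
    target => "timestamp"
}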

If anyone can explain why I now have to explicitly redefine a valid ISO8601 date, I'd love to know.

1 Answer:

Answer 0 (score: 0)

Make sure to specify the timestamp format you expect in the index mapping, which might look like this:

PUT index
{
  "mappings": {
    "your_index_type": {
      "properties": {
        "timestamp": {
          "type": "date",
          "format": "yyyy-MM-dd'T'HH:mm:ssZ" <-- make sure to give the correct one
        }
      }
    }
  }
}

If the format is not specified correctly, Elasticsearch will expect timestamp values in its default ISO format. You can also do a date match for the timestamp field in the filter, which would look like this:

date {
    match => [ "timestamp" , "yyyy-MM-dd'T'HH:mm:ssZ" ] <-- make sure the pattern matches your timestamps
    target => "timestamp"
    locale => "en"
    timezone => "UTC"
}


Or you can add a new field and match the timestamp into that instead; you can then drop the original timestamp field if you don't actually use it, since the new field holds the parsed value. Hope it helps.
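
A minimal sketch of that variant (read_timestamp is a hypothetical name for the new field):

date {
    match => [ "timestamp", "yyyy-MM-dd'T'HH:mm:ssZ" ]
    target => "read_timestamp"            # parsed date lands in a new field
}
mutate {
    remove_field => [ "timestamp" ]       # optional: drop the original if it is unused
}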