Logstash changes the @timestamp received from filebeat

Date: 2017-07-18 10:23:14

Tags: elasticsearch timestamp logstash filebeat

I noticed that the @timestamp field, which is correctly set by filebeat, gets automatically changed by logstash: its value is replaced with the timestamp taken from the log line itself (carried in the field a_timeStamp). Here is part of the logstash debug log:

    [2017-07-18T11:55:03,598][DEBUG][logstash.pipeline] filter received {"event"=>{"@timestamp"=>2017-07-18T09:54:53.507Z, "offset"=>498, "@version"=>"1", "input_type"=>"log", "beat"=>{"hostname"=>"centos-ea", "name"=>"filebeat_shipper_kp", "version"=>"5.5.0"}, "host"=>"centos-ea", "source"=>"/home/elastic/ELASTIC_NEW/log_bw/test.log", "message"=>"2017-06-05 19:02:46.779 INFO [bwEngThread:In-Memory Process Worker-4] psg.logger - a_applicationName=\"PieceProxy\", a_processName=\"piece.PieceProxy\", a_jobId=\"bw0a10ao\", a_processInstanceId=\"bw0a10ao\", a_level=\"Info\", a_phase=\"ProcessStart\", a_activityName=\"SetAndLog\", a_timeStamp=\"2017-06-05T19:02:46.779\", a_sessionId=\"\", a_sender=\"PCS\", a_cruid=\"37d7e225-bbe5-425b-8abc-f4b44a5a1560\", a_MachineCode=\"CFDM7757\", a_correlationId=\"fa10f\", a_trackingId=\"9d3b8\", a_message=\"START=piece.PieceProxy\"", "type"=>"log", "tags"=>["beats_input_codec_plain_applied"]}}

    [2017-07-18T11:55:03,629][DEBUG][logstash.pipeline] output received {"event"=>{"a_message"=>"START=piece.PieceProxy", "log"=>"INFO", "bwthread"=>"[bwEngThread:In-Memory Process Worker-4]", "logger"=>"psg.logger", "a_correlationId"=>"fa10f", "source"=>"/home/elastic/ELASTIC_NEW/log_bw/test.log", "a_trackingId"=>"9d3b8", "type"=>"log", "a_sessionId"=>"\"\"", "a_sender"=>"PCS", "@version"=>"1", "beat"=>{"hostname"=>"centos-ea", "name"=>"filebeat_shipper_kp", "version"=>"5.5.0"}, "host"=>"centos-ea", "a_level"=>"Info", "a_processName"=>"piece.PieceProxy", "a_cruid"=>"37d7e225-bbe5-425b-8abc-f4b44a5a1560", "a_activityName"=>"SetAndLog", "offset"=>498, "a_MachineCode"=>"CFDM7757", "input_type"=>"log", "message"=>"2017-06-05 19:02:46.779 INFO [bwEngThread:In-Memory Process Worker-4] psg.logger - a_applicationName=\"PieceProxy\", a_processName=\"piece.PieceProxy\", a_jobId=\"bw0a10ao\", a_processInstanceId=\"bw0a10ao\", a_level=\"Info\", a_phase=\"ProcessStart\", a_activityName=\"SetAndLog\", a_timeStamp=\"2017-06-05T19:02:46.779\", a_sessionId=\"\", a_sender=\"PCS\", a_cruid=\"37d7e225-bbe5-425b-8abc-f4b44a5a1560\", a_MachineCode=\"CFDM7757\", a_correlationId=\"fa10f\", a_trackingId=\"9d3b8\", a_message=\"START=piece.PieceProxy\"", "a_phase"=>"ProcessStart", "tags"=>["beats_input_codec_plain_applied", "_dateparsefailure", "kv_ok", "taskStarted"], "a_processInstanceId"=>"bw0a10ao", "@timestamp"=>2017-06-05T17:02:46.779Z, "my_index"=>"bw_logs", "a_timeStamp"=>"2017-06-05T19:02:46.779", "a_jobId"=>"bw0a10ao", "a_applicationName"=>"PieceProxy", "TMS"=>"2017-06-05 19:02:46.779"}}

NB:

  1. I noticed that this does not happen with a simple pipeline (one without the grok, kv, and other plugins I use in my custom pipeline).
  2. I changed filebeat's json.overwrite_keys property to TRUE, but with no success.
  3. Can you explain why and where this change to @timestamp happens? I do not expect it to happen automatically (I have seen many people asking how to achieve exactly that), since @timestamp is a system field. What is wrong here?

    Here is my pipeline:

    input { 
       beats {
            port => "5043"
            type => json
        }
    }
    filter {    
          #date {
          #  match => [ "@timestamp", "ISO8601" ]
          #  target => "@timestamp"
          #}
    
        if "log_bw" in [source] {
                    grok {
                        patterns_dir => ["/home/elastic/ELASTIC_NEW/logstash-5.5.0/config/patterns/extrapatterns"]
                        match => { "message" => "%{CUSTOM_TMS:TMS}\s*%{CUSTOM_LOGLEVEL:log}\s*%{CUSTOM_THREAD:bwthread}\s*%{CUSTOM_LOGGER:logger}-%{CUSTOM_TEXT:text}" }    
                        tag_on_failure => ["no_match"]
                    }
    
                    if "no_match" not in [tags] {
    
                        if "Payload for Request is" in [text] {
    
                            mutate {
                                add_field => { "my_index" => "json_request" }
                            }                                       
    
                            grok {
                                patterns_dir => ["/home/elastic/ELASTIC_NEW/logstash-5.5.0/config/patterns/extrapatterns"]
                                match => { "text" => "%{CUSTOM_JSON:json_message}" }
                            }
    
                            json {
                                source => "json_message"
                                tag_on_failure => ["errore_parser_json"]
                                target => "json_request"
                            }
    
                            mutate {
                                remove_field => [ "json_message", "text" ]
                            }
                        }
                        else {
    
                            mutate {
                                add_field => { "my_index" => "bw_logs" }
                            }
    
                            kv {
                                source => "text"
                                trim_key => "\s"
                                field_split => ","
                                add_tag => [ "kv_ok" ]
                            }
    
                            if "kv_ok" not in [tags] {
                                drop { }
                            }
    
                            else {
    
                                mutate {
                                    remove_field => [ "text" ]
                                }
    
                                if "ProcessStart" in [a_phase] {
                                    mutate {
                                        add_tag => [ "taskStarted" ]
                                    }
                                }
    
                                if "ProcessEnd" in [a_phase] {
                                    mutate {
                                        add_tag => [ "taskTerminated" ]
                                    }
                                }
    
                                date {
                                    match => [ "a_timeStamp", "yyyy'-'MM'-'dd'T'HH:mm:ss.SSS" ]
                                }
    
                                elapsed {
                                    start_tag => "taskStarted"
                                    end_tag => "taskTerminated"
                                    unique_id_field => "a_cruid"
                                }
                            }
                        }       
                    }
        }
        else {
    
            mutate {
                add_field => { "my_index" => "other_products" } 
            }
        }
    }
    output {
    
            elasticsearch { 
                index => "%{my_index}"
                hosts => ["localhost:9200"] 
            }
    
            stdout { codec => rubydebug }
    
            file {
                path => "/tmp/loggata.tx"
                codec => json
            }
    }
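
As a side note, one way to see exactly where @timestamp gets rewritten is to copy it into a scratch field at the very top of the filter block and compare the two values in the rubydebug output. A minimal debugging sketch (the field name beat_timestamp is just illustrative, not part of the pipeline above):

    filter {
        # keep a copy of the beat-supplied @timestamp for later comparison
        mutate {
            add_field => { "beat_timestamp" => "%{@timestamp}" }
        }
        # ... existing filters unchanged ...
    }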
    

Thanks a lot,

Andrea

1 Answer:

Answer 0 (score: 0)

This was the error (a typo left over from previous tests):

    date {
        match => [ "a_timeStamp", "yyyy'-'MM'-'dd'T'HH:mm:ss.SSS" ]
    }
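
For anyone hitting the same behavior: the date filter writes its parsed value into @timestamp unless an explicit target is set, so matching on a_timeStamp silently replaces the beat-supplied timestamp. A minimal sketch that keeps @timestamp intact (the target name a_parsedTimeStamp is purely illustrative):

    date {
        match  => [ "a_timeStamp", "yyyy-MM-dd'T'HH:mm:ss.SSS" ]
        # without an explicit target, the parsed value would overwrite @timestamp
        target => "a_parsedTimeStamp"
    }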

Thanks anyway, guys!