File input add_field does not add a field to every line

Date: 2014-05-16 13:36:21

Tags: logstash

I'm parsing several log files from clusters of different load-balanced servers with my logstash configuration, and I want to add a field "log_origin" to each file's entries so they can be filtered easily later on.

Here is the input -> file configuration in a simplified example:

input {
  file {
    type => "node1"
    path => "C:/Development/node1/log/*"
    add_field => [ "log_origin", "live_logs" ]
  }
  file {
    type => "node2"
    path => "C:/Development/node2/log/*"
    add_field => [ "log_origin", "live_logs" ]
  }
  file {
    type => "node3"
    path => "C:/Development/node1/log/*"
    add_field => [ "log_origin", "live_logs" ]
  }
  file {
    type => "node4"
    path => "C:/Development/node1/log/*"
    add_field => [ "log_origin", "live_logs" ]
  }
}

filter {
    grok {
        match => [
            "message","%{DATESTAMP:log_timestamp}%{SPACE}\[%{DATA:class}\]%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}%{GREEDYDATA:log_message}"
        ]
    }

    date { 
        match => [ "log_timestamp",  "dd.MM.YY HH:mm:ss", "ISO8601" ]
        target => "@timestamp"
    }

    mutate {
        lowercase => ["loglevel"]
        strip     => ["loglevel"]
    }

    if "_grokparsefailure" in [tags] {
        multiline {
            pattern   => ".*"
            what      => "previous"
        }
    }

    if [fields.log_origin] == "live_logs" {
        if [type] == "node1" {
            mutate { 
                add_tag => "realsServerName1"
            }
        }
        if [type] == "node2" {
            mutate { 
                add_tag => "realsServerName2"
            }
        }
        if [type] == "node3" {
            mutate { 
                add_tag => "realsServerName3"
             }
        }
        if [type] == "node4" {
            mutate { 
                add_tag => "realsServerName4"
            }
        }
    }
}

output {
  stdout { }
  elasticsearch { embedded => true }
}

I would have expected logstash to add this field with its value to every log entry it finds, but it doesn't. Am I taking the wrong approach here?

Edit: I can't retrieve the logs directly from the nodes; I have to copy them over to my "server" first. Otherwise I would be able to use the file path to tell the different clusters apart...
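(If the logs could be read in place, the path field that the file input sets on each event could be parsed instead of tagging by type. A minimal sketch, assuming the directory layout above; the node_name field is hypothetical:)

filter {
  grok {
    # extract the node name (e.g. "node1") from the file path
    match => [ "path", "C:/Development/%{WORD:node_name}/log/" ]
  }
}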

Edit: It's working now. I should have cleaned up my data in between. Old entries without the field were inflating my results.
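(One way to clean up in between, assuming the embedded Elasticsearch is listening on its default port 9200, is to delete the old Logstash indices before re-importing; the file input's sincedb may also need clearing so the files are read again:)

curl -XDELETE 'http://localhost:9200/logstash-*'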

1 Answer:

Answer 0 (score: 6)

add_field takes a hash. It should be

add_field => {
  "log_origin" => "live_logs"
}
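
Applied to the first input block from the question, the corrected syntax looks like this (the other file blocks change the same way):

input {
  file {
    type => "node1"
    path => "C:/Development/node1/log/*"
    # hash syntax: field name => value
    add_field => { "log_origin" => "live_logs" }
  }
}

Note that events already indexed without the field won't gain it retroactively; only newly processed lines carry log_origin, which is why stale entries can skew the results. Depending on the Logstash version, the conditional in the filter would then reference the field directly, e.g. if [log_origin] == "live_logs".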