How do I parse logs containing a nested JSON array in Logstash using the grok and json filters?

Time: 2015-09-07 08:47:23

Tags: arrays json elasticsearch logstash logstash-grok

I am trying to parse logs that contain a nested JSON array. When I use the json filter, the first level is parsed fine, but the nested array is not, and when I try to view the events I get an error in Kibana.
Here is the log:

    {
        "a": "one",
        "b": "two",
        "c": {
            "alpha": "awesome"
        },
        "d": "three",
        "e": [
            {
                "f": "four",
                "g": "five,six,seven"
            },
            {}
        ],
        "close": ""
    }
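
To isolate just the JSON-parsing step from the file input and the grok stage, here is a minimal sketch (it assumes the sample above is compacted onto a single line and piped in on stdin):

    input {
        stdin {
            # assumes one complete JSON document per line
            codec => json
        }
    }
    output {
        # print the parsed event, including the nested array "e", for inspection
        stdout { codec => rubydebug }
    }

If the nested array under e comes through intact here, the problem is more likely in the multiline/grok stage or on the Elasticsearch/Kibana side.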

The configuration I am using is:

    input {
        file {
            codec => multiline {
                pattern => "^\]"
                negate => true
                what => previous
            }
            type => "log"
            tags => "name"
            path => ["/path/to/log"]
            start_position => "beginning"
            sincedb_path => "/dev/null"
        }
    }
    filter {
        grok {
            match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:payload}" }
        }
        json {
            source => "payload"
        }
    }

    output {
        elasticsearch {
            index => "logstash-%{type}"
            host => "localhost"
            port => "9200"
            cluster => "elasticsearch"
        }
        stdout { codec => rubydebug }
    }
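
If the goal is to query the elements of the nested array individually, one common approach (a sketch only, using the field name e from the sample above, not something this configuration attempts) is to add Logstash's split filter after the json filter, so that each array element becomes its own event:

    filter {
        # ... grok and json as above ...
        split {
            # emit one copy of the event per element of the array "e";
            # each copy then carries a single object, e.g. [e][f] = "four"
            field => "e"
        }
    }

Each resulting event then holds a single object under e, which avoids the array-of-objects layout that Kibana (as of 2015-era versions) does not display well.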

This configuration parses the log, but the nested JSON array is left unparsed and Kibana reports an error.

0 Answers:

There are no answers yet.