Combining multiple message fields with the multiline codec in Logstash?

Asked: 2017-05-10 15:46:38

Tags: elasticsearch, logstash

I am using Logstash 2.4.0.

My output is like this:

{
      "@timestamp" => "2017-05-10T18:14:47.269Z",
         "message" => "[2017-01-14 10:59:58,591][WARN ][index.search.slowlog.query] [yaswanth] [bank][3] took[50ms], took_millis[50], types[details], stats[], search_type[QUERY_THEN_FETCH], total_shards[5], source[{\"sort\":[{\"balance\":{\"order\":\"asc\"}}]}], extra_source[], \r",
        "@version" => "1",
            "path" => "F:\\logstash-2.4.0\\logstash-2.4.0\\bin\\picaso.txt",
            "host" => "yaswanth",
       "TIMESTAMP" => "2017-01-14 10:59:58,591",
           "LEVEL" => "WARN",
           "QUERY" => "index.search.slowlog.query",
          "QUERY1" => "yaswanth",
      "INDEX-NAME" => "bank",
           "SHARD" => "3",
            "TOOK" => "50ms",
           "TOOKM" => 50,
           "types" => "details",
     "search_type" => "QUERY_THEN_FETCH",
    "total_shards" => "5",
    "source_query" => "{\"sort\":[{\"balance\":{\"order\":\"asc\"}}]}"
}
{
      "@timestamp" => "2017-05-10T18:14:47.270Z",
         "message" => "[2017-01-14 10:59:58,591][WARN ][index.search.slowlog.query] [yaswanth] [bank][2] took[50.2ms], took_millis[50], types[details], stats[], search_type[QUERY_THEN_FETCH], total_shards[5], source[{\"sort\":[{\"balance\":{\"order\":\"asc\"}}]}], extra_source[], \r",
        "@version" => "1",
            "path" => "F:\\logstash-2.4.0\\logstash-2.4.0\\bin\\picaso.txt",
            "host" => "yaswanth",
       "TIMESTAMP" => "2017-01-14 10:59:58,591",
           "LEVEL" => "WARN",
           "QUERY" => "index.search.slowlog.query",
          "QUERY1" => "yaswanth",
      "INDEX-NAME" => "bank",
           "SHARD" => "2",
            "TOOK" => "50.2ms",
           "TOOKM" => 50,
           "types" => "details",
     "search_type" => "QUERY_THEN_FETCH",
    "total_shards" => "5",
    "source_query" => "{\"sort\":[{\"balance\":{\"order\":\"asc\"}}]}"
}

But what I want is something like this:

{
          "@timestamp" => "2017-05-10T18:14:47.269Z",
             "message" => "[2017-01-14 10:59:58,591][WARN ][index.search.slowlog.query] [yaswanth] [bank][3] took[50ms], took_millis[50], types[details], stats[], search_type[QUERY_THEN_FETCH], total_shards[5], source[{\"sort\":[{\"balance\":{\"order\":\"asc\"}}]}], extra_source[], \r",[2017-01-14 10:59:58,591][WARN ][index.search.slowlog.query] [yaswanth] [bank][2] took[50.2ms], took_millis[50], types[details], stats[], search_type[QUERY_THEN_FETCH], total_shards[5], source[{\"sort\":[{\"balance\":{\"order\":\"asc\"}}]}], extra_source[], \r"
            "@version" => "1",
                "path" => "F:\\logstash-2.4.0\\logstash-2.4.0\\bin\\picaso.txt",
                "host" => "yaswanth",
           "TIMESTAMP" => "2017-01-14 10:59:58,591",
               "LEVEL" => "WARN",
               "QUERY" => "index.search.slowlog.query",
              "QUERY1" => "yaswanth",
          "INDEX-NAME" => "bank",
               "SHARD" => "3",
                "TOOK" => "50ms",
               "TOOKM" => 50,
               "types" => "details",
         "search_type" => "QUERY_THEN_FETCH",
        "total_shards" => "5",
        "source_query" => "{\"sort\":[{\"balance\":{\"order\":\"asc\"}}]}"
    }

I want to combine the message fields from multiple events into a single event, so that I can send them in one email.

Is there anything wrong with the above configuration? Do I have to use the aggregate filter for this kind of requirement?

Thanks

1 Answer:

Answer 0 (score: 0)

What you can do is aggregate these events at the file input plugin level, before they are sent to the output plugins. A good example is given here.
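
I can't tell what the linked example contains, but a minimal sketch of that idea with the multiline codec (which the question title already mentions) might look like the config below. The pattern, the negate/what combination and the auto_flush_interval value are my assumptions, not something taken from the original post; the path is the one shown in the events above.

input {
  file {
    path => "F:/logstash-2.4.0/logstash-2.4.0/bin/picaso.txt"
    start_position => "beginning"
    codec => multiline {
      # every slowlog line starts with "[", so every line matches the pattern
      # and is appended to the previous line, folding them into a single event
      pattern => "^\["
      negate => false
      what => "previous"
      # assumed value: flush the accumulated event after 5 seconds of inactivity
      auto_flush_interval => 5
    }
  }
}

Because every line matches "^\[", each new line is folded into the previous one, so all slowlog entries read within the flush window end up in a single message field.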

You may need to modify your grok filter slightly.
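
For illustration only, a hypothetical sketch of what such a grok filter could look like (it is my guess, not the asker's actual filter); the field names simply mirror the ones visible in the output above:

filter {
  grok {
    # parse the first slowlog entry of the (possibly folded) message into the
    # fields shown above; the full folded text stays in "message" for the email
    match => { "message" => "^\[%{TIMESTAMP_ISO8601:TIMESTAMP}\]\[%{LOGLEVEL:LEVEL}\s*\]\[%{DATA:QUERY}\] \[%{DATA:QUERY1}\] \[%{DATA:INDEX-NAME}\]\[%{NUMBER:SHARD}\] took\[%{DATA:TOOK}\], took_millis\[%{NUMBER:TOOKM:int}\]" }
  }
}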