Elasticsearch DSL query to filter duplicate documents from Kibana logs (filter-based)

Date: 2019-06-11 15:35:02

Tags: elasticsearch logging kibana elasticsearch-aggregation elasticsearch-dsl

I am working with a system that delivers emails, SMS, etc. I am shipping the application logs to Kibana through an ELK pipeline.

The logs look like this:

{
    "@timestamp": "2019-06-11T13:52:57.842914+00:00",
    "severity": "debug",
    "messageId": "ABC",
    "message": "delivered successfully"
}
{
    "@timestamp": "2019-06-11T13:52:57.842914+00:00",
    "severity": "debug",
    "messageId": "ABC",
    "message": "delivered successfully"
}
{
    "@timestamp": "2019-06-11T13:52:57.842914+00:00",
    "severity": "debug",
    "messageId": "XYZ",
    "message": "delivered successfully"
}
{
    "@timestamp": "2019-06-11T13:52:57.842914+00:00",
    "severity": "debug",
    "messageId": "DEF",
    "message": "delivered successfully"
}
{
    "@timestamp": "2019-06-11T13:52:57.842914+00:00",
    "severity": "debug",
    "messageId": "DEF",
    "message": "delivered successfully"
}
{
    "@timestamp": "2019-06-11T13:52:57.842914+00:00",
    "severity": "debug",
    "messageId": "DEF",
    "message": "delivered successfully"
}

My requirement is to create a Kibana filter that renders all messages that were delivered more than once (i.e., messages that reported "delivered successfully" multiple times, like the IDs "ABC" and "DEF" above).

I am trying to add a filter that would return something like:

ABC - 2
DEF - 3

Within a given time range, we should also be able to get the total count of such messages: 2 in the case above.

I have been trying combinations of match and regex checks along with the aggs component of the Elasticsearch DSL, but with no success so far.
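For reference, here is a sketch of one way to express this in the Elasticsearch query DSL, assuming `messageId` is indexed with a keyword sub-field named `messageId.keyword` and the index pattern is `my-logs-*` (both are assumptions; adjust to your mapping and index names). It filters on the delivery message, buckets by message ID with `min_doc_count: 2` so only duplicates appear, and uses a `stats_bucket` pipeline aggregation to count how many such IDs exist:

```json
POST /my-logs-*/_search
{
  "size": 0,
  "query": {
    "bool": {
      "filter": [
        { "match_phrase": { "message": "delivered successfully" } },
        { "range": { "@timestamp": { "gte": "now-24h" } } }
      ]
    }
  },
  "aggs": {
    "duplicate_ids": {
      "terms": {
        "field": "messageId.keyword",
        "min_doc_count": 2,
        "size": 1000
      }
    },
    "duplicate_id_total": {
      "stats_bucket": {
        "buckets_path": "duplicate_ids>_count"
      }
    }
  }
}
```

Against the sample logs above, the `duplicate_ids` buckets would be `ABC` (doc_count 2) and `DEF` (doc_count 3), and `duplicate_id_total.count` would be 2, the number of IDs delivered more than once. Note that Kibana's search-bar filters operate on individual documents and cannot compare documents against each other, so an aggregation of this kind is needed; in Kibana itself, a similar view can be approximated with a Data Table visualization using a Terms aggregation on `messageId.keyword` with the minimum document count set to 2.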

Any ideas/examples on how to do this would be highly appreciated.

0 Answers:

No answers yet