Unable to parse my custom log in Logstash

Date: 2017-08-24 09:47:03

Tags: elasticsearch logstash kibana filebeat

Below is a sample of my service log, which I want to parse with Logstash. Please suggest a plugin or approach for parsing it.

"msgs": [{
        "ts": "2017-07-17T12:22:00.2657422Z",
        "tid": 4,
        "eid": 1,
        "lvl": "Information",
        "cat": "Microsoft.AspNetCore.Hosting.Internal.WebHost",
        "msg": {
            "cnt": "Request starting HTTP/1.1 POST http://localhost:20001/Processor text/xml; charset=utf-8 601",
            "Protocol": "HTTP/1.1",
            "Method": "POST",
            "ContentType": "text/xml; charset=utf-8",
            "ContentLength": 601,
            "Scheme": "http",
            "Host": "localhost:20001",
            "PathBase": "",
            "Path": "/Processor",
            "QueryString": ""
        }
    },
    {
        "ts": "2017-07-17T12:22:00.4617773Z",
        "tid": 4,
        "lvl": "Information",
        "cat": "NCR.CP.Service.ServiceHostMiddleware",
        "msg": {
            "cnt": "REQ"
        },
        "data": {
            "Headers": {
                "Connection": "Keep-Alive",
                "Content-Length": "601",
                "Content-Type": "text/xml; charset=utf-8",
                "Accept-Encoding": "gzip, deflate",
                "Expect": "100-continue",
                "Host": "localhost:20001",
                "SOAPAction": "\"http://servereps.mtxeps.com/TransactionService/SendTransaction\""
            }
        }
    }]
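Since the entries are already JSON, Logstash's json filter (rather than grok) is usually the right tool here, followed by the split filter to turn each element of the msgs array into its own event. A minimal pipeline sketch, with the caveat that the Beats port, Elasticsearch address, and index name are assumptions, and that it presumes each log record reaches Logstash as a single JSON line (otherwise Filebeat's multiline settings must be configured first):

```
input {
  beats {
    port => 5044                      # assumed Filebeat output port
  }
}

filter {
  json {
    source => "message"               # parse the raw JSON shipped by Filebeat
  }
  split {
    field => "msgs"                   # one event per entry in the msgs array
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]       # assumed Elasticsearch address
    index => "service-logs-%{+YYYY.MM.dd}"   # assumed index name
  }
}
```

After the split, each event carries fields such as [msgs][lvl], [msgs][cat], and [msgs][msg][cnt], which can be used directly in Kibana.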

Please suggest how to apply filters to this type of log so that I can extract fields from it and visualize them in Kibana.

I have heard of the grok filter, but what pattern would I need to use here?
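Grok is aimed at unstructured text, so for a structured JSON log like this you generally do not need a grok pattern at all; the json filter does the parsing. If you still want to break the free-text cnt message into fields after the JSON has been parsed and the msgs array split, a sketch along these lines could work (the field paths and the pattern are assumptions based on the sample above):

```
filter {
  grok {
    match => {
      # assumed field path once each msgs entry is its own event
      "[msgs][msg][cnt]" => "Request starting %{NOTSPACE:protocol} %{WORD:method} %{URI:request_uri} %{GREEDYDATA:rest}"
    }
  }
  # map the original log timestamp onto @timestamp; ISO8601 should cover
  # the ts format in the sample, including the fractional seconds
  date {
    match => ["[msgs][ts]", "ISO8601"]
  }
}
```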

0 Answers