Writing a grok expression for name-value pairs

Time: 2018-02-06 16:32:15

Tags: logstash logstash-grok

I am getting the following output from my metrics:

05 Feb 2018 16:02:37,076  INFO SaveMetrics:29 - Metrics  :[Metric [name=httpsessions.max, value=-1, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=httpsessions.active, value=0, timestamp=Mon Feb 05 16:02:37 EST 2018]]
05 Feb 2018 16:02:37,085  INFO SaveMetrics:29 - Metrics  :[Metric [name=datasource.primary.active, value=0, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=datasource.primary.usage, value=0.0, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=datasource.hcarsessioninfo.active, value=0, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=datasource.hcarsessioninfo.usage, value=0.0, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=datasource.vhcpdemo.active, value=0, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=datasource.vhcpdemo.usage, value=0.0, timestamp=Mon Feb 05 16:02:37 EST 2018]]
05 Feb 2018 16:02:37,086  INFO SaveMetrics:29 - Metrics  :[Metric [name=mem, value=854991, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=mem.free, value=441374, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=processors, value=8, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=instance.uptime, value=2520, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=uptime, value=78701, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=systemload.average, value=-1.0, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=heap.committed, value=733184, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=heap.init, value=262144, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=heap.used, value=291809, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=heap, value=3708416, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=nonheap.committed, value=124120, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=nonheap.init, value=2496, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=nonheap.used, value=121807, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=nonheap, value=0, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=threads.peak, value=77, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=threads.daemon, value=13, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=threads.totalStarted, value=97, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=threads, value=77, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=classes, value=13851, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=classes.loaded, value=13851, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=classes.unloaded, value=0, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=gc.ps_scavenge.count, value=11, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=gc.ps_scavenge.time, value=466, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=gc.ps_marksweep.count, value=3, timestamp=Mon Feb 05 16:02:37 EST 2018], Metric [name=gc.ps_marksweep.time, value=459, timestamp=Mon Feb 05 16:02:37 EST 2018]]

I need to write a grok expression to extract each name-value pair.

Is there a way to do this? Any help is appreciated.

2 answers:

Answer 0 (score: 0)

I don't think you can use grok for this purpose.

Writing your own filter might be another option.

That way you could parse each event and convert it into a proper hash.

You could use an inline ruby filter as a starting point; see the sketch below.
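
For instance, a minimal inline ruby filter along these lines (a sketch only, not from the original answer; the regex and the per-pair field handling are my assumptions):

filter {
  ruby {
    # Scan the raw message for every name=/value= pair and copy each pair
    # onto the event as its own field (sketch; tune the regex to your data).
    code => '
      event.get("message").to_s.scan(/name=([^,\]]+), value=([^,\]]+)/).each do |name, value|
        event.set(name, value)
      end
    '
  }
}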

https://regexr.com/ may help with debugging the regular expressions.

Resources

https://www.elastic.co/guide/en/logstash/current/_how_to_write_a_logstash_filter_plugin.html

Answer 1 (score: 0)

Given the complexity of what I ended up doing, I think logstash is the wrong tool for this job and you would be better served by a real program (or a custom ruby filter), but since I got a working solution, I will post it.

The split and kv filters can be used like this:
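
(The exact configuration did not survive in this copy of the answer; what follows is a minimal sketch of the split-and-kv approach. The grok pattern, the gsub rewrites, and the split terminator are my assumptions.)

filter {
  # Capture the bracketed metric list into a "Metrics" field.
  grok {
    match => { "message" => "Metrics\s+:\[%{GREEDYDATA:Metrics}\]$" }
  }
  # Rewrite "Metric [a], Metric [b]" into "a|b" so it can be split.
  mutate {
    gsub => [
      "Metrics", "\], Metric \[", "|",
      "Metrics", "^Metric \[", "",
      "Metrics", "\]$", ""
    ]
  }
  # Emit one event per metric.
  split {
    field => "Metrics"
    terminator => "|"
  }
  # Keep the raw chunk, then break it into its name/value/timestamp fields.
  mutate { copy => { "Metrics" => "metric" } }
  kv {
    source => "Metrics"
    field_split => ","
    value_split => "="
  }
  # Removed for readability, as noted at the end of this answer.
  mutate { remove_field => ["Metrics", "message", "host"] }
}

Because kv splits on bare commas, the value and timestamp keys keep a leading space, which is why they appear as " value" and " timestamp" in the events that follow.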

{  
   " timestamp":"Mon Feb 05 16:02:37 EST 2018",
   "@timestamp":"2018-02-07T15:01:58.609Z",
   "metric":"name=httpsessions.max, value=-1, timestamp=Mon Feb 05 16:02:37 EST 2018",
   " value":"-1",
   "@version":"1",
   "name":"httpsessions.max"
}
{  
   " timestamp":"Mon Feb 05 16:02:37 EST 2018",
   "@timestamp":"2018-02-07T15:01:58.609Z",
   "metric":"name=httpsessions.active, value=0, timestamp=Mon Feb 05 16:02:37 EST 2018",
   " value":"0",
   "@version":"1",
   "name":"httpsessions.active"
}

For example, the two events shown above are the result of the first input line.

I have removed the host, Metrics, and message fields (mutate { remove_field => ["Metrics", "message", "host"] }) to keep the output readable.