How to filter JSON using Logstash/Filebeat and Grok

Date: 2017-04-28 07:39:14

Tags: logstash logstash-grok filebeat

I'm new to all of these technologies and have been scratching my head for over a week without finding the right answer. I have a log file like this:

"2017-04-13 17:15:34.649 INFO  [http-bio-8080-exec-5] Adapter:132 |Empty|Empty|===Request object=== GetTransKey=============
"2017-04-13 17:15:34.699 INFO  [http-bio-8080-exec-5] Adapter:133 |Empty|Empty|Request object : sessionId:null,  busiCode:GetTransKey,  reqPubInfo:{"appId":"com.info.tss","sessionId":null,"version":"10000","timestamp":"20150206165957","lang":"EN","userId":null,"serviceId":null,"circleId":null,"route":null,"customerId":null,"osType":null}, param:{"type":0,"key":"MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQCKmsCyw+YomiNbvkUP3D7OtvOMd7jq0aNa0APSp5E5PsYW7fpaUMniWkQeAwD3EmhzF5v3oXGA2bqAZ+b0ZJgv2BoEGYPoaCzOZBglDzUe8xldK5mMJHLiMwL0enkwURQvubnTUAxXMS0SPcXq4/jyX9mBu27Ht+zjT8Y3vO51JwIDAQAB","deviceInfo":null}
"2017-04-13 17:15:34.699 INFO  [http-bio-8080-exec-5] Adapter:137 |Empty|Empty|Event:GetTransKey|StartTime:1492083934699ms
"2017-04-13 17:15:34.713 DEBUG [http-bio-8080-exec-5] RedisCache:72 |Empty|Empty|===mode=1 Redis cache connect to host:10.135.25.108 port:28333
"2017-04-13 17:15:34.720 DEBUG [http-bio-8080-exec-5] RedisCache:159 |Empty|Empty|{"lifo":true,"fairness":false,"maxWaitMillis":20,"minEvictableIdleTimeMillis":60000,"softMinEvictableIdleTimeMillis":1800000,"numTestsPerEvictionRun":-1,"evictionPolicyClassName":"org.apache.commons.pool2.impl.DefaultEvictionPolicy","testOnCreate":false,"testOnBorrow":false,"testOnReturn":true,"testWhileIdle":true,"timeBetweenEvictionRunsMillis":30000,"blockWhenExhausted":true,"jmxEnabled":true,"jmxNamePrefix":"pool","jmxNameBase":null,"maxTotal":50,"maxIdle":10,"minIdle":0}
"2017-04-13 17:15:42.830 INFO  [http-bio-8080-exec-5] Adapter:145 |Empty|Empty|Event:GetTransKey|End Time:1492083942830ms|Total Time:8131ms|Status:0
"2017-04-13 17:15:42.831 INFO  [http-bio-8080-exec-5] Adapter:148 |Empty|Empty|===Resp data===  GetTransKey=============
"2017-04-13 17:15:42.831 INFO  [http-bio-8080-exec-5] Adapter:149 |Empty|Empty|Resp object : sessionId:null,  busiCode:GetTransKey,  respData:{"transKey":"W73GHuCMhSXnihDxlBA/QKzbF4dhqZlLWylINlvi4Ben1ViECepll2zL7Az489Uk4/e0HsT3/zkG\nSyIB9M9EDbp9rLqZIARCcBRUIYJ/N3YIDrQSvD7SyoIjg+ti/my17U/TLVgi3BLPkMQw9/0XhNpA\n/LYePHed2pe0FYun3xo=","sessionId":"216bc5f3-cdec-4998-9494-717c8e3769a6"}

Here I'm only interested in two JSON objects, reqPubInfo and respData, but I can't work out how to parse them; the documentation feels like an ocean. Please point me to how I can parse just the JSON objects out of the log.

My configuration file so far looks like this:

input {
    beats {
        port => "5043"
    }

}
filter {
  json {
    source => "message"
  }
}
output {
    stdout { codec => rubydebug }
}

1 Answer:

Answer 0 (score: 1)

You need to parse the message with a grok filter before you can apply the JSON filter. If you can change the application's logger configuration to emit plain JSON instead, that is preferable, since no grok parsing would be needed at all.

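In that pure-JSON scenario the json-only filter already shown in the question would essentially work as-is; the snippet below is a minimal sketch of that alternative, not part of the original answer, with a hypothetical leading-brace check added as a guard against non-JSON lines:

filter {
  # Only attempt JSON parsing on lines that look like JSON objects.
  if [message] =~ /^\{/ {
    json {
      source => "message"
    }
  }
}

With the log format as it stands, though, the message header first has to be picked apart with grok: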
filter {
  # Parse the log message.
  grok {
    pattern_definitions => {
      "LOGDATE"   => "[\d]{4}-[\d]{2}-[\d]{2} %{TIME}"
      "LOGHEADER" => "%{LOGDATE:logdate} %{LOGLEVEL:level}\s+\[%{GREEDYDATA:thread}\] %{NOTSPACE:file}:%{NUMBER:line}\s?"
    }   
    match => {
      message => [
        "%{LOGHEADER} %{GREEDYDATA:message} reqPubInfo:%{GREEDYDATA:reqPubInfo}, param:%{GREEDYDATA:param}",
        "%{LOGHEADER} %{GREEDYDATA:message} respData:%{GREEDYDATA:respData}",
        "%{LOGHEADER} %{GREEDYDATA:message}"
      ]   
    }   
    overwrite => [ "message" ]
  }   

  # Set @timestamp using the date in the log message.
  date {
    match => [ "logdate", "yyyy-MM-dd HH:mm:ss.SSS" ]
    remove_field => [ "logdate" ]
  }   

  # Parse the JSON data.
  if [reqPubInfo] {
    json {
      source => "reqPubInfo"
      target => "reqPubInfo"
    }   
    json {
      source => "param"
      target => "param"
    }   
  } else if [respData] {
    json {
      source => "respData"
      target => "respData"
    }   
  }   
}
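Since the question only cares about the two JSON payloads, any event that matched neither pattern could be discarded afterwards; the parsed values are then addressable as nested fields such as [reqPubInfo][appId] or [respData][transKey]. A possible follow-up filter (a sketch, not part of the original answer):

filter {
  # Drop events that carried neither of the two JSON payloads.
  if ![reqPubInfo] and ![respData] {
    drop { }
  }
}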

A self-contained configuration that you can use for testing can be found here.
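If that link is unavailable, a self-contained test pipeline is straightforward to put together: wrap the filter block from the answer between a stdin input and a rubydebug stdout output (a sketch; test.conf and adapter.log are placeholder names):

input {
  stdin { }
}

filter {
  # Paste the grok/date/json filter block from the answer here.
}

output {
  stdout { codec => rubydebug }
}

It can then be run against the sample log with something like bin/logstash -f test.conf < adapter.log, which prints each parsed event to the console.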