I have a question about dynamic mapping. I have a message field that contains JSON, which I parse with the Logstash json filter plugin, and the fields inside that JSON vary from event to event. What should I do? I want to parse it and add its fields to the event - I don't care what types those fields get.

I get the following error in Logstash:
[2019-02-13T13:12:20,087][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2019.02.13", :_type=>"doc", :routing=>nil}, #<LogStash::Event:...>], :response=>{"index"=>{"_index"=>"filebeat-2019.02.13", "_type"=>"doc", "_id"=>"uhzF5mgBmZ_b74M8qLSn", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [TestJson.payload] tried to parse field [payload] as object, but found a concrete value"}}}}
My grok file looks like this:
filter {
  if [source] =~ ".*request_response\.json$" {
    json {
      source => "message"
      target => "TestJson"
    }
    if [payload] =~ /{+/ {  # check if it is an object
      mutate {
        add_field => { "type" => "%{[TestJson][type]}" }
        add_field => { "payload" => "%{[TestJson][payload]}" }
      }
    } # end if payload is an object
    mutate {
      convert => {
        "type" => "string"
        "payload" => "string"
      } # end convert
    } # end mutate
  } # end if source is json
} # end filter
output {
  elasticsearch {
    hosts => "localhost:9201"
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  } # end elasticsearch
} # end output
The message field holds JSON whose payload is sometimes empty, sometimes a simple value such as a date, and sometimes a more complex object. I think the error occurs for events where the payload inside the message JSON is not a complex object.
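For illustration (these exact lines are hypothetical, modeled on the description above and on the logs later in this post), the same payload key arrives in conflicting shapes:

```json
{"type": "Request",  "payload": ""}
{"type": "Response", "payload": "2019-02-13T15:29:00.276"}
{"type": "Response", "payload": {"method": "POST", "sessionId": 476}}
```

Elasticsearch's dynamic mapping fixes the type of TestJson.payload on the first document it sees; whichever shape arrives later is rejected with mapper_parsing_exception.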
The reason I am writing this grok is that I want to create fields from the nested JSON object. How can I fix the error?
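Not part of the original config, but one possible way around the conflict (a sketch, assuming the standard Logstash ruby filter plugin is available) is to serialize payload to a string before indexing, so its mapping never flips between object and concrete value:

```
filter {
  ruby {
    code => '
      p = event.get("[TestJson][payload]")
      # Serialize hashes/arrays so [TestJson][payload] is always a string
      event.set("[TestJson][payload]", p.to_json) if p && !p.is_a?(String)
    '
  }
}
```

With every event carrying a string payload, dynamic mapping settles on a text field. Note that an existing index may still hold the old object mapping for TestJson.payload, so it may also be necessary to reindex or start a fresh daily index.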
Edit:
I added a stdout output next to elasticsearch in my Logstash config, and using the command
journalctl -u logstash.service --since "10 minutes ago" | grep -C 30 'Could not index event'
I got the following logs next to the error:
Feb 13 15:29:02 f logstash[19144]: [2019-02-13T15:29:02,328][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2019.02.13", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x4fe8d3ae>], :response=>{"index"=>{"_index"=>"filebeat-2019.02.13", "_type"=>"doc", "_id"=>"Px1C52gBmZ_b74M80KX2", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [TestJson.payload] tried to parse field [payload] as object, but found a concrete value"}}}}
Feb 13 15:29:05 f logstash[19144]: {
Feb 13 15:29:05 f logstash[19144]: "host" => {
Feb 13 15:29:05 f logstash[19144]: "id" => "b",
Feb 13 15:29:05 f logstash[19144]: "name" => "f",
Feb 13 15:29:05 f logstash[19144]: "containerized" => true,
Feb 13 15:29:05 f logstash[19144]: "architecture" => "x86_64",
Feb 13 15:29:05 f logstash[19144]: "os" => {
Feb 13 15:29:05 f logstash[19144]: "family" => "redhat",
Feb 13 15:29:05 f logstash[19144]: "platform" => "centos",
Feb 13 15:29:05 f logstash[19144]: "version" => "7 (Core)",
Feb 13 15:29:05 f logstash[19144]: "codename" => "Core"
Feb 13 15:29:05 f logstash[19144]: }
Feb 13 15:29:05 f logstash[19144]: },
Feb 13 15:29:05 f logstash[19144]: "pid" => "27854",
Feb 13 15:29:05 f logstash[19144]: "beat" => {
Feb 13 15:29:05 f logstash[19144]: "hostname" => "f",
Feb 13 15:29:05 f logstash[19144]: "name" => "f",
Feb 13 15:29:05 f logstash[19144]: "version" => "6.5.3"
Feb 13 15:29:05 f logstash[19144]: },
Feb 13 15:29:05 f logstash[19144]: "message" => "{\"type\":\"Response\",\"payload\":\"2019-02-13T15:29:00.276\"}",
Feb 13 15:29:05 f logstash[19144]: "severity" => "DEBUG",
Feb 13 15:29:05 f logstash[19144]: "parent" => "6fce34dc18cb0e31",
Feb 13 15:29:05 f logstash[19144]: "event" => "",
Feb 13 15:29:05 f logstash[19144]: "span" => "44c9f754c7ca5b58",
Feb 13 15:29:05 f logstash[19144]: "@timestamp" => 2019-02-13T14:29:01.669Z,
Feb 13 15:29:05 f logstash[19144]: "input" => {
Feb 13 15:29:05 f logstash[19144]: "type" => "log"
Feb 13 15:29:05 f logstash[19144]: },
Feb 13 15:29:05 f logstash[19144]: "thread" => "http-nio-9080-exec-54",
Feb 13 15:29:05 f logstash[19144]: "service" => "bi",
--
Feb 13 15:29:05 f logstash[19144]: "@timestamp" => 2019-02-13T14:29:01.669Z,
Feb 13 15:29:05 f logstash[19144]: "service" => "bi",
Feb 13 15:29:05 f logstash[19144]: "prospector" => {
Feb 13 15:29:05 f logstash[19144]: "type" => "log"
Feb 13 15:29:05 f logstash[19144]: },
Feb 13 15:29:05 f logstash[19144]: "thread" => "http-nio-9080-exec-54",
Feb 13 15:29:05 f logstash[19144]: "offset" => 3006421,
Feb 13 15:29:05 f logstash[19144]: "@version" => "1",
Feb 13 15:29:05 f logstash[19144]: "input" => {
Feb 13 15:29:05 f logstash[19144]: "type" => "log"
Feb 13 15:29:05 f logstash[19144]: },
Feb 13 15:29:05 f logstash[19144]: "tags" => [
Feb 13 15:29:05 f logstash[19144]: [0] "beats_input_codec_plain_applied"
Feb 13 15:29:05 f logstash[19144]: ],
Feb 13 15:29:05 f logstash[19144]: "TestJson" => {
Feb 13 15:29:05 f logstash[19144]: "payload" => "",
Feb 13 15:29:05 f logstash[19144]: "url" => "/bi/getTime",
Feb 13 15:29:05 f logstash[19144]: "type" => "Request",
Feb 13 15:29:05 f logstash[19144]: "sessionId" => 476,
Feb 13 15:29:05 f logstash[19144]: "username" => "k",
Feb 13 15:29:05 f logstash[19144]: "lang" => "pl",
Feb 13 15:29:05 f logstash[19144]: "contentType" => "null",
Feb 13 15:29:05 f logstash[19144]: "ipAddress" => "127.0.0.1",
Feb 13 15:29:05 f logstash[19144]: "method" => "POST",
Feb 13 15:29:05 f logstash[19144]: "queryString" => "null"
Feb 13 15:29:05 f logstash[19144]: },
Feb 13 15:29:05 f logstash[19144]: "trace" => "6fce34dc18cb0e31",
Feb 13 15:29:05 f logstash[19144]: "date" => "2019-02-13 15:29:00,277",
Feb 13 15:29:05 f logstash[19144]: "source" => "/opt/tomcat-bo/logs/bi_request_response.json"
Feb 13 15:29:05 f logstash[19144]: }
Feb 13 15:29:05 f logstash[19144]: [2019-02-13T15:29:05,326][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2019.02.13", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x1812fa5e>], :response=>{"index"=>{"_index"=>"filebeat-2019.02.13", "_type"=>"doc", "_id"=>"Uh1C52gBmZ_b74M83KWr", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [TestJson.payload] tried to parse field [payload] as object, but found a concrete value"}}}}
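For reference, the stdout output mentioned in the edit can be sketched like this (assuming the rubydebug codec, which pretty-prints each event the way the logs above show):

```
output {
  elasticsearch {
    hosts => "localhost:9201"
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  } # end elasticsearch
  stdout {
    codec => rubydebug
  } # end stdout
} # end output
```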