Given the following multi-line log:
{
"code" : 429
}
and the following pipeline logstash.conf:
filter {
  grok {
    match => {
      "message" => [ "%{GREEDYDATA:json}" ]
    }
  }
  json {
    source => "json"
    target => "json"
  }
}
When the log is shipped to Logstash via Filebeat, Logstash returns:
[2018-08-07T10:48:41,067][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-to-logstash", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2bf7b08d>], :response=>{"index"=>{"_index"=>"filebeat-to-logstash", "_type"=>"doc", "_id"=>"trAAFGUBnhQ5nUWmyzVg", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [json]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:3846"}}}}}
This is incorrect behavior, since the JSON is perfectly valid. How can it be fixed?
Answer 0: (score: 0)
I found that in Logstash 6.3.0, this problem occurs when you try to parse JSON into a field named "json". Renaming that field to anything else resolves the issue.
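A minimal sketch of the corrected pipeline, based on the rename described above (the name "parsed" is an arbitrary choice; any target other than "json" should avoid the conflict):

```conf
filter {
  grok {
    match => {
      "message" => [ "%{GREEDYDATA:json}" ]
    }
  }
  json {
    source => "json"
    # target renamed from "json" to a different field name
    target => "parsed"
  }
}
```

With this change, the parsed object is stored under "parsed" instead of overwriting the "json" source field, and the mapper_parsing_exception no longer occurs.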
Since the Elastic JSON filter plugin documentation does not mention this behavior, and the error message is misleading, this can be assumed to be a bug.