Logstash not converting JSON correctly

Date: 2016-07-11 13:56:28

Tags: json elasticsearch logstash

Below is my JSON log file:

[
    {
        "error_message": " Failed to get line from input file (end of file?).", 
        "type": "ERROR", 
        "line_no": "2625", 
        "file": "GTFplainText.c", 
        "time": "17:40:02", 
        "date": "01/07/16", 
        "error_code": "GTF-00014"
    }, 
    {
        "error_message": " Bad GTF plain text file header or footer line. ", 
        "type": "ERROR", 
        "line_no": "2669", 
        "file": "GTFplainText.c", 
        "time": "17:40:02", 
        "date": "01/07/16", 
        "error_code": "GTF-00004"
    }, 
    {
        "error_message": " '???' ", 
        "type": "ERROR", 
        "line_no": "2670", 
        "file": "GTFplainText.c", 
        "time": "17:40:02", 
        "date": "01/07/16", 
        "error_code": "GTF-00005"
    }, 
    {
        "error_message": " Failed to find 'event source'/'product detail' records for event source '3025188506' host event type 1 valid", 
        "type": "ERROR", 
        "line_no": "0671", 
        "file": "RGUIDE.cc", 
        "time": "15:43:48", 
        "date": "06/07/16", 
        "error_code": "RGUIDE-00033"
    }
]

As I understand it, since the logs are already in JSON, we shouldn't need a filter section in the Logstash configuration. Below is my Logstash input configuration:

input {
  file{
    path => "/home/ishan/sf_shared/log_json.json"
    start_position => "beginning"
    codec => "json"
  }
}

and the output configuration is:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
  stdout { codec => rubydebug }
}

But the data doesn't seem to make it into Elasticsearch, as I can't see any documents when I query the index. What am I missing?

1 Answer:

Answer 0 (score: 0)

I think the problem is that the json codec expects a complete JSON message on a single line and cannot handle a message spread across multiple lines.
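An alternative workaround (my own suggestion, not part of the original answer) is to pre-process the file into one compact JSON object per line (JSON Lines), which the json codec can then consume line by line. A minimal sketch in Python; the function name and file paths are illustrative, not from the original post:

```python
import json

def to_json_lines(src_path, dst_path):
    """Convert a pretty-printed JSON array of log records into
    JSON Lines: one compact object per line, the shape the
    Logstash json codec expects."""
    with open(src_path) as src:
        records = json.load(src)  # parse the whole array at once
    with open(dst_path, "w") as dst:
        for record in records:
            # json.dumps with default settings emits a single line
            dst.write(json.dumps(record) + "\n")
```

After running this once over the log file, the original file input with `codec => "json"` could point at the converted file instead.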

A possible workaround is to use the multiline codec together with the json filter. The configuration for the multiline codec would be:

multiline {
  pattern => "]"
  negate => "true"
  what => "next"
}

Every line that does not start with ] will be regrouped with the line that follows, so you end up with a single complete JSON document that can be handed to the json filter.
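Putting the answer's pieces together, the full pipeline might look like the sketch below. This is an assumption about how the suggestion would be wired up, reusing the path from the question and the multiline settings from the answer; it is not a tested configuration:

```
input {
  file {
    path => "/home/ishan/sf_shared/log_json.json"
    start_position => "beginning"
    codec => multiline {
      pattern => "]"      # the answer's pattern: the array-closing line
      negate => "true"    # lines NOT matching the pattern...
      what => "next"      # ...are joined with the line that follows
    }
  }
}

filter {
  json {
    source => "message"   # parse the reassembled document
  }
}
```

Note that with this approach the whole array arrives as one event, so the json filter's behavior on a top-level array (rather than one event per record) should be verified on your Logstash version.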