Logstash cannot index documents because it fails to parse the date

Asked: 2019-11-02 22:52:42

Tags: elasticsearch logstash

When running logstash to index documents into Elasticsearch, I get many errors like the following:

[2019-11-02T18:48:13,812][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"my-index-2019-09-28", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x729fc561>], :response=>{"index"=>{"_index"=>"my-index-2019-09-28", "_type"=>"doc", "_id"=>"BhlNLm4Ba4O_5bsE_PxF", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [timestamp] of type [date] in document with id 'BhlNLm4Ba4O_5bsE_PxF'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2019-09-28 23:32:10.586\" is malformed at \" 23:32:10.586\""}}}}}
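The telling part of the error is `Invalid format: "2019-09-28 23:32:10.586" is malformed at " 23:32:10.586"`: a `date` field with Elasticsearch's default format expects strict ISO 8601, where date and time are joined by a `T`, so the space-separated value is rejected. The Python sketch below is only an analogy for that strict parse, not Elasticsearch's actual Joda/Java-time parser:

```python
from datetime import datetime

raw = "2019-09-28 23:32:10.586"  # space-separated, as in the rejected document

# Strict ISO 8601 (roughly what a default Elasticsearch date field accepts)
# joins date and time with a "T", so this variant parses fine:
parsed = datetime.strptime(raw.replace(" ", "T"), "%Y-%m-%dT%H:%M:%S.%f")

# The raw, space-separated value does not match the strict pattern, which
# mirrors the 'malformed at " 23:32:10.586"' part of the 400 response:
try:
    datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S.%f")
    strict_parse_ok = True
except ValueError:
    strict_parse_ok = False
```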

Clearly something is wrong with how the date is formed, but I can't tell what. Below are excerpts from my logstash config and my elasticsearch template. I'm including them because I'm trying to name the index in the logstash config using a YYYY-MM-dd date: I copy timestamp into @timestamp, format that into date metadata, and use the stored metadata to build the index name.

Logstash config:

input {
      stdin { type => stdin }
}
filter {
  csv {
     separator => " "   # this is a tab (\t), not just whitespace
     columns => ["timestamp","field1", "field2", ...]
     convert => {
       "timestamp" => "date_time"
       ...
     }
  }
}

filter {
  date {
    match => ["timestamp", "yyyy-MM-dd' 'HH:mm:ss'.'SSS"]
    target => "@timestamp"
  }
}

filter {
  date_formatter {
    source => "@timestamp"
    target => "[@metadata][date]"
    pattern => "YYYY-MM-dd"
  }
}


filter {
  mutate {
    remove_field => [
      "@timestamp",
      ...
    ]
  }
}

output {
   amazon_es {
     hosts =>
         ["my-es-cluster.us-east-1.es.amazonaws.com"]
     index => "my-index-%{[@metadata][date]}"
     template => "my-config.json"
     template_name => "my-index-*"
     region => "us-east-1"
  }
}

Template:

{
    "template" : "my-index-*",
    "mappings" : {
      "doc" : {
        "dynamic" : "false",
        "properties" : {
          "timestamp" : {
            "type" : "date"
          }, ...
        }
      }
    },
    "settings" : {
      "index" : {
        "number_of_shards" : "12",
        "number_of_replicas" : "0"
      }
    }
}

When I inspect the raw data, it looks fine to me and appears to be formatted correctly, so I'm not sure what my problem is.

Here is a sample line, redacted, but with the problem field (the first one) left unchanged:

2019-09-28 07:29:46.454 NA  2019-09-28 07:29:00 someApp 62847957802 62847957802
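Whether that first field survives intact depends entirely on the separator. A small Python sketch of the two outcomes (the tab placement in this redacted line is an assumption, since tabs and spaces look alike when pasted):

```python
# Redacted sample line; "\t" marks the assumed tab separators between columns.
line = "2019-09-28 07:29:46.454\tNA\t2019-09-28 07:29:00\tsomeApp\t62847957802\t62847957802"

# With a real tab separator, the whole timestamp stays in the first column:
tab_first = line.split("\t")[0]

# With a single-space separator, the timestamp is cut in half at the space:
space_first = line.split(" ")[0]
```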

2 Answers:

Answer 0 (score: 1)

It turns out the problem was the convert block: logstash could not understand the time format specified in the file. To fix it, I renamed the original timestamp field to unformatted_timestamp and applied the date formatter I was already using:

filter {
  date {
    match => ["unformatted_timestamp", "yyyy-MM-dd' 'HH:mm:ss'.'SSS"]
    target => "timestamp"
  }
}

filter {
  date_formatter {
    source => "timestamp"
    target => "[@metadata][date]"
    pattern => "YYYY-MM-dd"
  }
}
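As a rough Python analogy of what the two filters above accomplish (strptime/strftime patterns stand in for the Joda-style ones; this is a sketch, not the logstash implementation):

```python
from datetime import datetime

raw = "2019-09-28 07:29:46.454"                       # unformatted_timestamp

# The date filter parses the raw string into a real timestamp:
ts = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S.%f")

# The date_formatter filter renders it back out as yyyy-MM-dd for the index:
index_suffix = ts.strftime("%Y-%m-%d")                # [@metadata][date]
index_name = "my-index-" + index_suffix
```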

Answer 1 (score: 0)

You are parsing your lines with the csv filter and setting the separator to a single space, but your date is also separated by a space. This way, your first field, named timestamp, only gets the date 2019-09-28, and the time ends up in the field named field1.

You can solve your problem by creating a new field, for example named date_and_time, whose content is the date and the time:

csv {
    separator => " "
    columns => ["date","time","field1","field2","field3","field4","field5","field6"]
}
mutate {
    add_field => { "date_and_time" => "%{date} %{time}" }
}
mutate {
    remove_field => ["date","time"]
}
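The csv + mutate chain above boils down to a join of the first two columns; a minimal Python sketch, using the values from the question's sample line:

```python
# Event after the csv filter has split the first two columns:
event = {"date": "2019-09-28", "time": "07:29:46.454"}

# mutate add_field: concatenate date and time with a space in between
event["date_and_time"] = "%s %s" % (event["date"], event["time"])

# mutate remove_field: drop the now-redundant pieces
del event["date"], event["time"]
```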

This creates a field named date_and_time with the value 2019-09-28 07:29:46.454, and you can now use the date filter to parse this value into the @timestamp field, the logstash default.

date {
    match => ["date_and_time", "yyyy-MM-dd HH:mm:ss.SSS"]   # yyyy (year), not YYYY (week-year)
}

This gives you two fields with the same value, date_and_time and @timestamp. Since @timestamp is the logstash default, I suggest keeping it and removing date_and_time:

mutate { remove_field => ["date_and_time"] }

Now you can create your date-based index using the YYYY-MM-dd format; logstash will extract the date from the @timestamp field. Just change the index line in your output to this one:

index => "my-index-%{+YYYY-MM-dd}"