Updates before INSERTS

Asked: 2015-08-18 19:47:19

Tags: elasticsearch logstash logstash-configuration

I have a logstash configuration like this:

input {
  file {
    path => ["/home/csdata/*.data"]
    codec => json { }
    start_position => "beginning"
    discover_interval => 5
  }
}
output {
  if [_up] == 1 {
    elasticsearch {
      protocol => "http"
      host => "[myelasticsearchip]"
      cluster => "clustername"
      flush_size => 50
      index => "%{_index}"
      action => "update"
      document_id => "%{_id}"
      index_type => "%{_type}"
    }
  }
  else if [_id] != "" {
    elasticsearch {
      protocol => "http"
      host => "[myelasticsearchip]"
      cluster => "clustername"
      flush_size => 50
      index => "%{_index}"
      document_id => "%{_id}"
      index_type => "%{_type}"
    }
  }
  else {
    elasticsearch {
      protocol => "http"
      host => "[myelasticsearchip]"
      cluster => "clustername"
      index => "%{_index}"
      flush_size => 50
      index_type => "%{_type}"
    }
  }
}
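
For reference, each line of the .data files is a JSON event; based on the field names the conditionals above rely on, an event might look something like this (the concrete values here are made up for illustration):

{"_id": "a1b2c3", "_index": "myindex", "_type": "mytype", "_up": 1, "title": "updated title"}

An event with _up == 1 takes the update branch, an event with a non-empty _id (and no _up flag) takes the branch that indexes with an explicit document id, and everything else falls through to the last output and lets elasticsearch assign an id.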

I am getting a lot of:

failed action with response of 404, dropping action:

The data all arrives in the same file, in order, so each document should be created before it is ever updated. This doesn't happen for every item, but it happens for plenty of them, and I would like to have none of these errors.

Is this caused by the differing flush_sizes, even though the items are ordered within the original file, which should mean the INSERT always happens before the update?
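
One workaround I could imagine, assuming the version of the elasticsearch output plugin in use supports the doc_as_upsert option, would be to let the update branch create the document when it does not exist yet (sketch based on the update output above, other settings unchanged):

elasticsearch {
  protocol => "http"
  host => "[myelasticsearchip]"
  cluster => "clustername"
  flush_size => 50
  index => "%{_index}"
  action => "update"
  doc_as_upsert => true     # if the document_id is missing, insert the event as a new document
  document_id => "%{_id}"
  index_type => "%{_type}"
}

With doc_as_upsert enabled, an update that arrives before its matching insert no longer fails with a 404 for a missing document; it creates the document from the event instead.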

Any ideas would be greatly appreciated!

1 answer:

Answer 0 (score: 0)

Maybe this can help.

I had the same problem and found 404 index_not_found_exception in the elasticsearch logs. My solution was to enable automatic index creation in elasticsearch.

Because my indices look like logstash_api-20160112,

I added this in elasticsearch.yml:

action.auto_create_index: +logstash_api-*
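
The value is a comma-separated list of allow (+) and deny (-) patterns, so if you want auto-creation only for these indices and nothing else, something along these lines should also work (a sketch; adjust the pattern to your own naming scheme):

action.auto_create_index: +logstash_api-*,-*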