Exporting Elasticsearch data to CSV using Logstash

Date: 2019-08-08 10:00:41

Tags: elasticsearch logstash

I am trying to export Elasticsearch index data to CSV using Logstash. Here is my configuration:

<code>
input {
   elasticsearch {
    hosts => "localhost:9200"
    index => "index_name"
    query => '
    {
            "query": {
                    "match_all": {}
            }
    }
  '
  }
}
output {
 csv {
    # Elasticsearch field names to write as CSV columns
    fields => [ "field1", "field2" ]
    # Path where the output CSV is written
    path => "/etc/logstash/scrap/index_name.csv"
  }

}
</code>

However, the CSV contains only part of the data rather than all of it. Below is the output from the log file:

<code>
[INFO ] 2019-08-08 13:27:01.368 [Ruby-0-Thread-22@[main]>worker18: :1] csv - Opening file {:path=>"/etc/logstash/scrap/index_name.csv"}
[INFO ] 2019-08-08 13:30:29.187 [Ruby-0-Thread-26@[main]>worker22: :1] csv - Closing file /etc/logstash/scrap/index_name.csv
[INFO ] 2019-08-08 13:30:30.776 [Ruby-0-Thread-24@[main]>worker20: :1] csv - Opening file {:path=>"/etc/logstash/scrap/index_name.csv"}
[INFO ] 2019-08-08 13:31:56.951 [[main]-pipeline-manager] pipeline - Pipeline has terminated {:pipeline_id=>"main", :thread=>"#"}
</code>

The file is opened and closed multiple times, which may be the problem. Please suggest how to fix this.
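The log lines above come from different worker threads (worker18, worker20, worker22) opening and closing the same output file, so the writes may be contending with one another. A minimal sketch of one thing to try, assuming the problem is worker-related (this is a hypothesis, not a confirmed fix), is to pin the pipeline to a single worker via the standard `pipeline.workers` setting:

<code>
# logstash.yml
# Run the pipeline with one worker so only a single thread
# ever opens /etc/logstash/scrap/index_name.csv
pipeline.workers: 1
</code>

The same effect can be had per-run by starting Logstash with the `-w 1` command-line flag instead of editing `logstash.yml`.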

0 Answers:

There are no answers yet.