Unable to export Elasticsearch data to a CSV file

Date: 2020-09-18 11:57:38

Tags: csv elasticsearch logstash

I am trying to export Elasticsearch data to a CSV file,

following the link below:

https://qbox.io/blog/how-to-export-data-elasticsearch-into-csv-file

and I installed the logstash-input-elasticsearch and logstash-output-csv plugins.
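
For reference, I installed the plugins with the logstash-plugin command, roughly like this (run from the Logstash home directory; exact paths may differ on other installs):

# install the Elasticsearch input and CSV output plugins
bin/logstash-plugin install logstash-input-elasticsearch
bin/logstash-plugin install logstash-output-csv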

I created the configuration file given there and tried to run it. I got a few errors, and after googling I found that "filtered" is no longer used, so I changed the config file as follows:

input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "collab*"
    query => '
    {
      "query": {
        "bool": {
          "must": {
            "match": {
              "text": "*"
            }
          },
          "filter": {
            "bool": {
              "must": [
                {
                  "range": {
                    "@timestamp": {
                      "gte": 1600425000000,
                      "lte": 1600428600000,
                      "format": "epoch_millis"
                    }
                  }
                }
              ],
              "must_not": []
            }
          }
        }
      }
    }'
  }
}

output {
  csv {
    fields => ["name", "type", "count", "scripted", "searchable"]
    path => "/tmp/csv-export.csv"
  }
}
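
I then ran the pipeline roughly like this (from the Logstash home directory; the config file path matches the one shown in the log below):

# run the pipeline with the config file above
bin/logstash -f config/output-csv.conf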

Now I am getting the following logs:

Sending Logstash logs to /home/criuser/elasticsearch/logstash-7.6.2/logs which is now configured via log4j2.properties
[2020-09-18T11:46:45,101][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-09-18T11:46:45,696][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.2"}
[2020-09-18T11:46:54,890][INFO ][org.reflections.Reflections] Reflections took 285 ms to scan 1 urls, producing 20 keys and 40 values
[2020-09-18T11:46:57,707][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.RubyArray) has been created for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2020-09-18T11:46:57,802][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, "pipeline.sources"=>["/home/criuser/elasticsearch/logstash-7.6.2/config/output-csv.conf"], :thread=>"#<Thread:0x7beeb2a9 run>"}
[2020-09-18T11:47:04,405][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-09-18T11:47:04,893][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-09-18T11:47:07,304][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-09-18T11:47:10,102][INFO ][logstash.runner          ] Logstash shut down.

but I cannot find the output file in the /tmp folder.
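
In case it helps with debugging, I think the same query can be sent directly to Elasticsearch with something like the following curl call (a rough sketch of what I believe the input plugin runs; I have not verified it is exactly equivalent):

# search the same index with the same bool/range query used in the Logstash input
curl -s -X POST -H 'Content-Type: application/json' 'http://localhost:9200/collab*/_search?pretty' -d '
{
  "query": {
    "bool": {
      "must": { "match": { "text": "*" } },
      "filter": {
        "bool": {
          "must": [
            { "range": { "@timestamp": { "gte": 1600425000000, "lte": 1600428600000, "format": "epoch_millis" } } }
          ]
        }
      }
    }
  }
}'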

Could you please help me? I am not sure whether the configuration is correct; if it is not, please tell me how to change it, and how to make the exported data available in the output file.

Thanks in advance.

Note: both the Logstash and Elasticsearch versions are 7.6.2.
