Saving CSV data to Elasticsearch using Logstash

Time: 2017-09-22 10:56:07

Tags: csv elasticsearch logstash elasticsearch-plugin elasticsearch-5

I want to use Logstash to save CSV data in Elasticsearch so that I get the following result:

"my_field": [{"col1":"AAA", "col2": "BBB"},{"col1":"CCC", "col2": "DDD"}]

So it is important that the CSV data is saved as an array [...] inside one specific document.
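(Side note: Elasticsearch flattens arrays of objects by default, so if the col1/col2 pairs in my_field ever need to be queried together, the index would usually need a nested mapping. A minimal sketch, assuming the index and type names from the configuration below and keyword types for the columns:)

PUT my_index
{
  "mappings": {
    "my_type": {
      "properties": {
        "my_field": {
          "type": "nested",
          "properties": {
            "col1": { "type": "keyword" },
            "col2": { "type": "keyword" }
          }
        }
      }
    }
  }
}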

However, this is the result I am getting instead:

      "path": "path/to/csv",
      "@timestamp": "2017-09-22T11:28:59.143Z",
      "@version": "1",
      "host": "GT-HYU",
      "col2": "DDD",
      "message": "CCC,DDD",
      "col1": "CCC"

It looks like only the last CSV row gets saved (because each row overwrites the previous one). I tried using document_id => "1" in Logstash, but that is obviously what triggers the overwriting. How can I save the data as an array? Also, I don't understand how to specify that the data should be saved under my_field. Here is my configuration:

input {
    file {
        path => ["path/to/csv"]
        sincedb_path => "/dev/null"
        start_position => "beginning"
    }
}

filter {  
    csv {
        columns => ["col1","col2"]
        separator => ","
    }
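    # drop the CSV header row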
    if [col1] == "col1" {
        drop {}
    }
}

output {
    stdout { codec => rubydebug }
    elasticsearch {
        action => "update"
        hosts => ["127.0.0.1:9200"]
        index => "my_index"
        document_type => "my_type"
        document_id => "1"
        workers => 1
    }
}
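One way to get all rows into a single array (a sketch, not a verified answer): Logstash's aggregate filter can collect every row of a file into one map, cancel the per-row events, and flush the map as a single combined event once the file goes quiet. The %{path} task id and the 10-second timeout below are assumptions, and the filter only works reliably with a single pipeline worker (-w 1):

filter {
    aggregate {
        # group all rows coming from the same file
        task_id => "%{path}"
        code => "
            map['my_field'] ||= []
            map['my_field'] << { 'col1' => event.get('col1'), 'col2' => event.get('col2') }
            event.cancel    # drop the per-row event; only the flushed map gets indexed
        "
        # when no new rows arrive for 10 seconds, emit the map as one event
        push_map_as_event_on_timeout => true
        timeout => 10
        timeout_task_id_field => "path"
    }
}

This aggregate block would go after the csv filter in the existing filter section. The flushed event then carries my_field as an array of {col1, col2} objects, so indexing it with document_id => "1" produces one document instead of one document per row.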

0 Answers