Importing a CSV file into Elasticsearch

Time: 2017-07-28 11:27:17

Tags: elasticsearch logstash

I am trying to import a huge CSV file into Elasticsearch, using Logstash for the import. Sample CSV file [note: multiple rows per shop]:

Shop_name,Review_Title,Review_Text,,,,
Accord ,Excellent ,Nice Collection.,,,,,
Accord , Bad ,Not too comfortable,,,
Accord , Good ,excellent location and Staff,,,
Accord , Good ,Great Colletion,,,
Shopon,good,  staff very good ,,,
Harrisons ,Spacious,Nice Colletion

Logstash configuration:

input {
    file {
        path => ["shopreview.csv"]
        start_position => "beginning"
    }
}

filter {
    csv {
        columns => [
            "Shop_name",
            "Review_Title",
            "Review_Text"

        ]

    }
}

output {
    stdout { codec => rubydebug }
    elasticsearch {
        action => "index"
        hosts => ["127.0.0.1:9200"]
        index => "reviews"
        document_type => "shopreview"
        document_id => "%{Shop_name}"
        workers => 1
    }
}

Here, when I query the reviews, I should get all the reviews for a particular shop.

The problem: when I query localhost:9200/reviews/shopreview/Accord I do not get all the values, only one entry. What is missing in the configuration? I am new to the ELK stack.
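[Editor's note: with `document_id => "%{Shop_name}"`, every review for the same shop is indexed under the same `_id`, so each row overwrites the previous one and only the last review per shop survives. A minimal sketch of an output block that avoids this, simply letting Elasticsearch assign a unique `_id` per row, might look like:]

```
output {
    elasticsearch {
        hosts => ["127.0.0.1:9200"]
        index => "reviews"
        document_type => "shopreview"
        # document_id omitted: Elasticsearch generates a unique _id per event,
        # so multiple reviews for one shop are all kept. They can then be
        # retrieved with a query such as /reviews/_search?q=Shop_name:Accord
    }
}
```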

0 Answers:

No answers