I have a CSV file with 1000 rows and 3 columns, like this:
field1, field2, field3
ABC A65 ZZZ
...
I want to export its contents into the mapping myrecords of the index myindex (the index contains other mappings as well):
PUT /myindex
{
"mappings": {
"myrecords": {
"_all": {
"enabled": false
},
"properties": {
"field1": { "type": "keyword" },
"field2": { "type": "keyword" },
"field3": { "type": "keyword" }
}
}
}
}
Is there a simple way to do this?
Update
I ran the Logstash config file below, but even though the CSV is small (1000 entries), the process runs forever. When I execute GET /myindex/myrecords/_search, I keep seeing only 1 record:
input {
file {
path => ["/usr/develop/data.csv"]
start_position => "beginning"
}
}
filter {
csv {
columns => ["field1","field2","field3"]
separator => ","
}
}
output {
stdout { codec => rubydebug }
elasticsearch {
action => "index"
hosts => ["127.0.0.1:9200"]
index => "myindex"
document_type => "myrecords"
document_id => "%{Id}" # Here I also tried "%{field1}"
workers => 1
}
}
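A likely explanation, as a hedged guess: no field named Id exists in the parsed events, so "%{Id}" is not substituted and every document is indexed under the same literal _id, each row overwriting the previous one; that would leave exactly 1 record. Also, the file input tails the file indefinitely (so Logstash never "finishes") and remembers its read position in a sincedb, so the file may not be re-read on subsequent runs. A minimal corrected sketch, assuming a Logstash version that supports the sincedb_path option on the file input:

```
input {
  file {
    path => ["/usr/develop/data.csv"]
    start_position => "beginning"
    sincedb_path => "/dev/null"  # do not persist the read position, so the file is re-read on every run
  }
}
filter {
  csv {
    columns => ["field1","field2","field3"]
    separator => ","
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    action => "index"
    hosts => ["127.0.0.1:9200"]
    index => "myindex"
    document_type => "myrecords"
    # No document_id here: Elasticsearch then generates a unique _id per event.
    # Only set document_id => "%{field1}" if field1 is guaranteed unique per row.
  }
}
```

With document_id omitted, each of the 1000 rows should become its own document; Logstash itself will still keep running, since the file input watches the file for new lines.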