I am trying to import a CSV file to create data on my Elasticsearch server for testing, but I am stuck importing the data with a config file.
Here is the command (on Windows): logstash -f file.config
Here is my config file:
input {
  file {
    path => "/E:/Formation/kibana/data/cars.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["maker","model","mileage","manufacture_year","engine_displacement",
                "engine_power","body_type","color_slug","stk_year","transimission","door_count",
                "seat_count","fuel_type","date_created","date_last_seen","price_eur"]
  }
  mutate {
    convert => ["mileage","integer"]
    convert => ["price_eur","float"]
    convert => ["engine_power","integer"]
    convert => ["door_count","integer"]
    convert => ["seat_count","integer"]
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "cars"
    document_type => "sold_cars"
  }
  stdout { }
}
Here is the error. Update: this is the log after running with --debug. Thanks for any help.
16:49:29.252 [Ruby-0-Thread-11: E:/Formation/kibana/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:532] DEBUG logstash.pipeline - Pushing flush onto pipeline
16:49:34.257 [Ruby-0-Thread-11: E:/Formation/kibana/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:532] DEBUG logstash.pipeline - Pushing flush onto pipeline
16:49:39.257 [Ruby-0-Thread-11: E:/Formation/kibana/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:532] DEBUG logstash.pipeline - Pushing flush onto pipeline
16:49:43.663 [[main]<file] DEBUG logstash.inputs.file - _globbed_files: /e/Formation/kibana/data/cars.csv: glob is: []
Answer 0 (score: 0):
On Windows you should use sincedb_path => "nul" instead of sincedb_path => "/dev/null", which is for Linux-based operating systems.
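For clarity, here is a minimal sketch of the adjusted input block, with everything other than sincedb_path kept exactly as in the question's config:

input {
  file {
    path => "/E:/Formation/kibana/data/cars.csv"   # unchanged from the question
    start_position => "beginning"
    sincedb_path => "nul"                          # Windows null device, replaces /dev/null
  }
}

Logstash uses the sincedb file to remember how far it has already read in each input file; pointing it at the null device discards that bookkeeping, so the file is re-read from the beginning on every run, which is what you want when repeatedly testing an import.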