I have a CSV file that I want to import into Elasticsearch 5.0.0 using Logstash.
Here are the first two lines of the CSV file:
Id,CameraId,ZoneId,Latitude,Longitude,Number,OriginalNumber,Country,OriginalCountry,CapturedOn,Status,Direction,Speed,Confidence,AvgDigitsHeight,MultiplateRate,ProcessingTimeOCR,Signaled,OcrImageId,EnvImageIds,CapturerId,CapturerType,IsAlarm,AlarmListIds,ReplicationId,ImagesUploaded
111,24,1,42.8,3.5,XXDFR,XXDFR,DE,DE,2017-03-04 12:06:20.0,0,1,0,99,21.0,3,16.0193003809306,0,0,[],null,null,0,[],0,0
I run this Logstash script:
input {
  file {
    path => ["/usr/develop/test.csv"]
    type => "core2"
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => [
      "Id","CameraId","ZoneId","Latitude","Longitude,"Number","OriginalNumber","Country","OriginalCountry","CapturedOn","Status","Direction","Speed","Confidence","AvgDigitsHeight","MultiplateRate","ProcessingTimeOCR","Signaled","OcrImageId","EnvImageIds","CapturerId","CapturerType","IsAlarm","AlarmListIds","ReplicationId","ImagesUploaded"
    ]
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    action => "index"
    hosts => ["127.0.0.1:9200"]
    index => "mytest"
    document_type => "doc"
    document_id => "%{Id}"
    workers => 1
  }
}
I get this error:
logstash.agent] fetched an invalid config {:config=>"input {\nfile {\npath => [\"/usr/develop/test.csv\"]\ntype => \"core2\"\nstart_position => \"beginning\"\n}\n}\nfilter {\ncsv {\nseparator => \",\"\ncolumns => [\"Id\",\"CameraId\",\"ZoneId\",\"Latitude\",\"Longitude,\"Number\",\"OriginalNumber\",\"Country\",\"OriginalCountry\",\"CapturedOn\"]\n}\n}\noutput {\nelasticsearch {\naction => \"index\"\nhosts => [\"localhost:9200\"]\nindex => \"test\"\ndocument_type => \"doc\"\ndocument_id => \"%{Id}\"\n\nworkers => 1\n}\nstdout { codec => rubydebug }\n}\n\n", :reason=>"Expected one of #, {, ,, ] at line 11, column 61 (byte 225) after filter {\ncsv {\nseparator => \",\"\ncolumns => [\"Id\",\"CameraId\",\"ZoneId\",\"Latitude\",\"Longitude,\""}
Answer 0 (score: 1)
Not sure if you caught this, but it's because you're missing a closing quote on your column name "Longitude".
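As a minimal sketch, assuming the rest of your config stays exactly as posted, the corrected csv filter block would look like this; the only change is the added closing quote after "Longitude":

filter {
  csv {
    # closing quote added after "Longitude"
    columns => [
      "Id","CameraId","ZoneId","Latitude","Longitude","Number","OriginalNumber","Country","OriginalCountry","CapturedOn","Status","Direction","Speed","Confidence","AvgDigitsHeight","MultiplateRate","ProcessingTimeOCR","Signaled","OcrImageId","EnvImageIds","CapturerId","CapturerType","IsAlarm","AlarmListIds","ReplicationId","ImagesUploaded"
    ]
  }
}

With the string terminated properly, the config should get past the parse error at line 11, column 61, and the pipeline should start; the stdout rubydebug output should then show each CSV row mapped to those column names.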