How to fix CSV parsing errors in Logstash

Date: 2020-05-27 12:34:17

Tags: ruby csv logstash filebeat

I am using Filebeat to send a CSV file to Logstash and then on to Kibana, but I get a parsing error when the CSV file is picked up by Logstash.

This is the content of the CSV file:

time    version id  score   type

May 6, 2020 @ 11:29:59.863  1 2 PPy_6XEBuZH417wO9uVe  _doc

logstash.conf:

input {
  beats {
    port => 5044
  }
}
filter {
  csv {
      separator => ","
      columns =>["time","version","id","index","score","type"]
      }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

Filebeat.yml:

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /etc/test/*.csv
    #- c:\programdata\elasticsearch\logs\*

And the error in Logstash:

[2020-05-27T12:28:14,585][WARN ][logstash.filters.csv     ][main] Error parsing csv {:field=>"message", :source=>"time,version,id,score,type,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,", :exception=>#<TypeError: wrong argument type String (expected LogStash::Timestamp)>}
[2020-05-27T12:28:14,586][WARN ][logstash.filters.csv     ][main] Error parsing csv {:field=>"message", :source=>"\"May 6, 2020 @ 11:29:59.863\",1,2,PPy_6XEBuZH417wO9uVe,_doc,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,", :exception=>#<TypeError: wrong argument type String (expected LogStash::Timestamp)>}

I do get some data in Kibana, but not what I want.


1 Answer:

Answer (score: 2):

I managed to get this working locally. The mistakes I noticed so far are:

  1. Using ES reserved fields such as @timestamp and @version as column names.
  2. The timestamp is not in ISO8601 format; it has an @ sign in the middle.
  3. Your filter sets the separator to ",", but the actual separator in the CSV is a tab ("\t").
  4. Judging from the error, Logstash is also trying to parse the header line. I suggest removing it from the CSV, or using the skip_header option.
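Points 3 and 4 can be addressed directly in the csv filter. A minimal sketch for the original tab-separated file (note: Logstash config strings do not interpret "\t" as a tab unless config.support_escapes is enabled, so a literal tab character is placed between the quotes here):

```
filter {
  csv {
    # a literal tab character between the quotes
    separator => "	"
    # skip the header row instead of deleting it from the file
    skip_header => true
    columns => ["time","version","id","score","type"]
  }
}
```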

Below is the logstash.conf file I used:

input {
    file {
        path => "C:/work/elastic/logstash-6.5.0/config/test.csv"
        start_position => "beginning"
    } 
}
filter { 
    csv { 
        separator => ","
        columns =>["time","version","id","score","type"]
    } 
} 
output { 
    elasticsearch { 
        hosts => ["localhost:9200"]
        index => "csv-test" 
    } 
}
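To fix point 2 as well, the non-ISO8601 time column can be converted into @timestamp with a date filter after the csv filter. A sketch, assuming a Joda-style pattern matching the rewritten CSV below (verify the pattern against your actual data):

```
filter {
    csv {
        separator => ","
        columns => ["time","version","id","score","type"]
    }
    date {
        # parse e.g. "May 6 2020 11:29:59.863"
        match => ["time", "MMM d yyyy HH:mm:ss.SSS"]
        target => "@timestamp"
    }
}
```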

The CSV file I used:

May 6 2020 11:29:59.863,1,PPy_6XEBuZH417wO9uVe,_doc
May 6 2020 11:29:59.863,1,PPy_6XEBuZH417wO9uVe,_doc
May 6 2020 11:29:59.863,1,PPy_6XEBuZH417wO9uVe,_doc
May 6 2020 11:29:59.863,1,PPy_6XEBuZH417wO9uVe,_doc
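Alternatively, if you prefer to keep the header row in the CSV, Filebeat itself can drop it before it ever reaches Logstash, via the exclude_lines option. A sketch (the regex assumes the header line starts with the literal word "time"):

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /etc/test/*.csv
  # drop the header row before shipping
  exclude_lines: ['^time']
```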

From my Kibana: (screenshot of the indexed documents not shown)