I'm trying to use the CSV filter with Logstash, but it isn't loading the values from my file. I'm running Ubuntu Server 14.04, Kibana 4, Logstash 1.4.2, and Elasticsearch 1.4.4. Below are the CSV file and the filter I wrote. Am I doing something wrong?
CSV file:
Joao,21,555
Miguel,24,1000
Rodrigo,43,443
Maria,54,2343
Antonia,67,213
Logstash CSV filter:
# This is the filter that reads the file and loads the data into an Elasticsearch index
input
{
file
{
path => ["/opt/logstash/bin/testeFile_lite.csv"]
start_position => "beginning"
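# read the file from the start instead of tailing it; this only applies to files Logstash has not seen before (i.e. with no sincedb entry yet)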
# sincedb_path => "NIL"
}
}
filter
{
csv
{
columns => ["nome", "idade", "salario"]
separator => ","
}
}
output
{
elasticsearch
{
action => "index"
host => "localhost"
index => "logstash-%{+YYYY.MM.dd}"
}
stdout
{
codec => rubydebug
}
}
When I execute the filter, it shows: Using milestone 2 input plugin 'file' ... and Using milestone 2 filter plugin 'csv' ..., and the OK message doesn't appear.
Can someone help me?
Answer 0 (score: 4)
I solved the problem by adding the sincedb_path field to the file input.
Here is the Logstash CSV filter:
input
{
file
{
path => "/opt/logstash/bin/testeFile_lite.csv"
type => "testeFile_lite"
start_position => "beginning"
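# pointing sincedb_path at a fresh, writable location keeps Logstash from skipping the file because of a previously recorded read position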
sincedb_path => "/opt/logstash/bin/dbteste"
}
}
filter
{
csv
{
columns => ['nome', 'idade', 'salario']
separator => ","
}
}
output
{
elasticsearch
{
action => "index"
host => "localhost"
index => "xpto"
cluster => "SIC_UTAD"
}
stdout
{
codec => rubydebug
}
}
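As an optional follow-up (not part of the original fix): the csv filter parses every column as a string, so if you want idade and salario indexed as numbers, a mutate convert step can be added to the filter section. A minimal sketch, reusing the column names above:
filter
{
  csv
  {
    columns => ['nome', 'idade', 'salario']
    separator => ","
  }
  mutate
  {
    # convert the parsed string fields to integers so Elasticsearch maps them as numeric types
    convert => [ "idade", "integer", "salario", "integer" ]
  }
}
Note that if the fields were already indexed as strings, the numeric mapping will only take effect for a newly created index.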