Logstash CSV import

Date: 2015-08-13 03:27:50

Tags: csv elasticsearch logstash kibana-4

I am using Ubuntu 14.04 LTS with Kibana, Logstash, and Elasticsearch. I tried to import my CSV file into Logstash with the configuration below, but the data is not being picked up.


I even tried the following:

input 
{
    file 
    {
        path => "/home/kibana/Downloads/FL_insurance_sample.csv"
        type => "FL_insurance_sample.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter 
{
    csv 
    {
        columns => ["policyID","statecode","county","eq_site_limit","hu_site_limit",
            "fl_site_limit","fr_site_limit","tiv_2011","tiv_2012","eq_site_deductible",
            "hu_site_deductible","fl_site_deductible","fr_site_deductible","point_latitude",
            "point_longitude","line","construction","point_granularity"]
        separator => ","
    }
}

output 
{
    elasticsearch {
        action => "index"
        host => "localhost"
        index => "promosms-%{+dd.MM.YYYY}"
        workers => 1
    }
    stdout
    {
        codec => rubydebug
    }

}
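One way to catch column-name typos in the csv filter (a common cause of silent import problems) is to diff the file's header row against the list given to `columns`. This is a minimal sketch; the header line below is a hypothetical reconstruction of the first row of FL_insurance_sample.csv, inlined so the check is self-contained:

```python
import csv
import io

# Column names as configured in the Logstash csv filter
configured = [
    "policyID", "statecode", "county", "eq_site_limit", "hu_site_limit",
    "fl_site_limit", "fr_site_limit", "tiv_2011", "tiv_2012",
    "eq_site_deductible", "hu_site_deductible", "fl_site_deductible",
    "fr_site_deductible", "point_latitude", "point_longitude", "line",
    "construction", "point_granularity",
]

# Hypothetical header row of FL_insurance_sample.csv (first line of the file)
sample_header = (
    "policyID,statecode,county,eq_site_limit,hu_site_limit,"
    "fl_site_limit,fr_site_limit,tiv_2011,tiv_2012,"
    "eq_site_deductible,hu_site_deductible,fl_site_deductible,"
    "fr_site_deductible,point_latitude,point_longitude,line,"
    "construction,point_granularity"
)

# Parse the header with the same separator the filter uses (",")
header = next(csv.reader(io.StringIO(sample_header)))

# Any mismatch here would mean the csv filter assigns wrong field names
assert header == configured, "configured columns do not match the CSV header"
print("columns match:", len(configured))
```

In practice you would replace `sample_header` with the real first line of the file (`open(path).readline()`).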

When I go to the index mapping section of the Kibana GUI, I select logstash-* but cannot find the data I want. P.S. My configuration file is stored at /etc/logstash/conf.d/simple.conf.

1 answer:

Answer 0 (score: 0)

In your question you say you went to logstash-* in Kibana, but your configuration file shows that you are putting the data into promosms-%{+dd.MM.YYYY}.

You need to go to the Settings section of Kibana 4, put [promosms-]DD.MM.YYYY into the index name or pattern box, and check "Index contains time-based events" and "Use event times to create index names".

Then you probably also want to set it as the default index.
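To see why the Kibana pattern must be [promosms-]DD.MM.YYYY rather than logstash-*, note that Logstash expands %{+dd.MM.YYYY} from the event's timestamp, producing a new index each day. A small sketch (the helper name `index_for` is hypothetical, for illustration only):

```python
from datetime import datetime, timezone

def index_for(ts):
    # Mimics Logstash's %{+dd.MM.YYYY} sprintf expansion: the event's
    # UTC timestamp rendered as day.month.year after the literal prefix
    return "promosms-" + ts.strftime("%d.%m.%Y")

# An event timestamped on the day this question was posted would land in:
print(index_for(datetime(2015, 8, 13, tzinfo=timezone.utc)))  # promosms-13.08.2015
```

Since no index ever starts with "logstash-", the default logstash-* pattern in Kibana matches nothing, which is exactly the symptom described in the question.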
