Data type conversion with Logstash grok

Time: 2014-12-18 14:36:42

Tags: elasticsearch logstash logstash-grok

Basic is a float field. The index in question does not yet exist in Elasticsearch. When I run the configuration file with logstash -f, I get no exceptions. However, the data that ends up in Elasticsearch shows the mapping of Basic as string. How can I correct this? And how would I do it for multiple fields?

input {  
      file {
          path => "/home/sagnik/work/logstash-1.4.2/bin/promosms_dec15.csv"
          type => "promosms_dec15"
          start_position => "beginning"
          sincedb_path => "/dev/null"
      }
}
filter {
    grok{
        match => [
            "Basic", " %{NUMBER:Basic:float}"
        ]
    }

    csv {
        columns => ["Generation_Date","Basic"]
        separator => ","
    }  
    ruby {
          code => "event['Generation_Date'] = Date.parse(event['Generation_Date']);"
    }

}
output {  
    elasticsearch { 
        action => "index"
        host => "localhost"
        index => "promosms-%{+dd.MM.YYYY}"
        workers => 1
    }
}

1 answer:

Answer 0 (score: 3)

You have two problems. First, your grok filter is listed before the csv filter, and because filters are applied in order there won't be a "Basic" field to convert when the grok filter is applied.

Second, grok won't overwrite existing fields unless you explicitly allow it. In other words,

grok{
    match => [
        "Basic", " %{NUMBER:Basic:float}"
    ]
}

will always be a no-op. Either specify overwrite => ["Basic"], or preferably use mutate's type conversion feature:

mutate {
    convert => ["Basic", "float"]
}
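
Putting this together, a corrected filter section might look like the sketch below (assuming the same CSV columns as in the question; the csv filter now runs first so the Basic field exists before mutate converts it). To convert additional fields, more field/type pairs can be added to convert:

```
filter {
    # Parse the CSV line first so the "Basic" field exists.
    csv {
        columns => ["Generation_Date","Basic"]
        separator => ","
    }
    # Convert "Basic" to a float; add more pairs here for other fields.
    mutate {
        convert => ["Basic", "float"]
    }
    ruby {
        code => "event['Generation_Date'] = Date.parse(event['Generation_Date']);"
    }
}
```

Note that the mapping of an existing Elasticsearch index won't change retroactively; if the index already maps Basic as a string, reindex (or use a new index) after fixing the filter.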