Logstash not_analyzed

Date: 2015-06-25 15:29:41

Tags: csv logstash kibana elastic-stack

I'm new to the ELK stack, and I'm probably trying to start out with a rather complicated setup... :-)

I'm running the whole stack on a Windows 7 laptop. I'm importing a nice clean CSV, but I can't get the string fields right: they show up as broken-up text in my Kibana visualisations.

My latest attempt is to use a template.

Both the template and the conf file are in the c:\logstash-1.5.0\bin directory.

This is the conf file:

input {
  file {
    path => "C:\Users\jeroen\Documents\temp\CSV\ElasticSearch_Input_vc.csv"
    type => "core2"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["snapshot_date_time","Country","Tower","Service","Division","USD Group","Ref Nr","Processtype","Importance","Priority","Severity","Status and Reason","Category","Is_Valid_Category","Summary","Open Date Time","Closed Date Time","Opened By","Last Modified","Resolve Completed Date Time","Hrs_Assigned_To_Completed","First Assign Date Time","Hrs_New_To_Assign","Customer Organization","Requested By","Assignee","Active Flag","In Out SLA Resolution 1"]
    separator => ";"
  }
  date {
    match => [ "snapshot_date_time", "yyyy-MM-dd HH:mm:ss" ]
  }
  mutate {
    convert => {
      "Hrs_Assigned_To_Completed" => "float"
      "Hrs_New_To_Assign" => "float"
    }
  }
}

output {
  elasticsearch {
    action => "index"
    host => "localhost"
    index => "qdb-%{+YYYY.MM.dd}"
    workers => 1
    template => "template.json"
  }
  #stdout {
  #  codec => rubydebug
  #}
}
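
For reference, a variant of the output block with an absolute template path (so it does not depend on the directory Logstash is started from) and template_overwrite would look like the sketch below; this assumes the 1.5 elasticsearch output supports those options:

output {
  elasticsearch {
    action => "index"
    host => "localhost"
    index => "qdb-%{+YYYY.MM.dd}"
    workers => 1
    # absolute path instead of a path relative to the startup directory
    template => "C:/logstash-1.5.0/bin/template.json"
    # replace any template Logstash has already installed under the same name
    template_overwrite => true
  }
}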

Here is the template (honestly, I just copied it from another topic and changed the "template" name), and I'm wondering what to do with line 7, since that is probably specific to the data the original poster was using...

#template.json:
{
  "template": "qdb-%{+YYYY.MM.dd}",
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0,
    "index": { "query": { "default_field": "userid" } }
  },
  "mappings": {
    "_default_": {
      "_all": { "enabled": false },
      "_source": { "compress": true },
      "dynamic_templates": [
        {
          "string_template": {
            "match": "*",
            "match_mapping_type": "string",
            "mapping": { "type": "string", "index": "not_analyzed" }
          }
        }
      ],
      "properties": {
        "date": { "type": "date", "format": "yyyy-MM-dd HH:mm:ss" },
        "device": { "type": "string", "fields": { "raw": { "type": "string", "index": "not_analyzed" } } },
        "distance": { "type": "integer" }
      }
    }
  }
}
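
I also wonder whether Elasticsearch ever picks this template up: as far as I understand, the "template" field is matched against index names as a wildcard pattern, so it would probably need to be something like "qdb-*" rather than the Logstash date expression, and a template only applies to indices created after it was installed. A quick check (host and port as on my machine) would be:

# list the templates Elasticsearch currently knows about
curl -XGET "http://localhost:9200/_template/?pretty"

# if the template had to be changed, delete and re-import the index so it is applied
curl -XDELETE "http://localhost:9200/qdb-*"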

Any help/hints/tips are appreciated!

2 Answers:

Answer 0 (score: 0):

What you need is to do the mapping in ElasticSearch first, before importing the data via Logstash; then you will see your data as not_analyzed in Kibana.

http://host:9200/yourindex/_mapping/yourtype

{
  "yourtype": {
    "properties": {
      "user": {
        "type": "string",
        "index": "not_analyzed"
      },
      "data": {
        "type": "string",
        "index": "not_analyzed"
      }
    }
  }
}
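
For example (index, type and field names here are just placeholders), the mapping could be put in place before the Logstash import roughly like this; note that an existing analyzed field cannot be switched to not_analyzed in place, so this has to happen on a fresh index:

# create the index, then put the mapping for the type
curl -XPUT "http://host:9200/yourindex"
curl -XPUT "http://host:9200/yourindex/_mapping/yourtype" -d '
{
  "yourtype": {
    "properties": {
      "user": { "type": "string", "index": "not_analyzed" },
      "data": { "type": "string", "index": "not_analyzed" }
    }
  }
}'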

Answer 1 (score: 0):

You can use the ".raw" version of the field.

For example, in my configuration I have a field called sourceip.

In my visualisations I can choose to use sourceip.raw, which is the 'not_analyzed' version of that field.

Check whether that exists in your case.
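
As far as I know, the .raw fields come from the default Logstash index template, which only applies to logstash-* indices, so with a custom index name like qdb-* you would need an equivalent multi-field mapping in your own template. Roughly something like the sketch below (field and template names are illustrative):

"dynamic_templates": [
  {
    "string_fields": {
      "match": "*",
      "match_mapping_type": "string",
      "mapping": {
        "type": "string",
        "index": "analyzed",
        "fields": {
          "raw": { "type": "string", "index": "not_analyzed" }
        }
      }
    }
  }
]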