I am using Elasticsearch-1.5.1, Kibana-4.0.2-linux-x86, and Logstash-1.4.2. My Logstash conf looks like this:
input {
  redis {
    data_type => 'list'
    key => 'pace'
    password => 'bhushan'
    type => 'pace'
  }
}
filter {
  geoip {
    source => "mdc.ip"
    target => "geoip"
    database => "/opt/logstash-1.4.2/vendor/geoip/GeoLiteCity.dat"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
}
output {
  if [type] == "pace" {
    elasticsearch {
      template_overwrite => true
      host => "localhost"
      index => 'pace'
      template => "/opt/logstash-1.4.2/mytemplates/elasticsearch-template.json"
      template_name => "bhushan"
    }
  }
  stdout {
    codec => rubydebug
  }
}
My elasticsearch-template.json looks like this:
{
  "template" : "bhushan",
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  "mappings" : {
    "_default_" : {
      "_all" : {"enabled" : true},
      "dynamic_templates" : [ {
        "string_fields" : {
          "match" : "*",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fields" : {
              "raw" : {"type": "string", "index" : "not_analyzed", "ignore_above" : 256}
            }
          }
        }
      } ],
      "properties" : {
        "@version": { "type": "string", "index": "not_analyzed" },
        "geoip" : {
          "type" : "object",
          "dynamic": true,
          "path": "full",
          "properties" : {
            "location" : { "type" : "geo_point" }
          }
        }
      }
    }
  }
}
When I query the mapping with curl http://localhost:9200/pace/_mapping/pace/field/geoip.location?pretty I get:
{
"pace" : {
"mappings" : {
"pace" : {
"geoip.location" : {
"full_name" : "geoip.location",
"mapping" : {
"location" : {
"type" : "double"
}
}
}
}
}
}
}
A sample log record looks like this:
{
"thread_name" => "main",
"mdc.ip" => "14.X.X.X",
"message" => "Hii, I m in info",
"@timestamp" => "2015-05-15T10:18:32.904+05:30",
"level" => "INFO",
"file" => "Test.java",
"class" => "the.bhushan.log.test.Test",
"line_number" => "15",
"logger_name" => "bhushan",
"method" => "main",
"@version" => "1",
"type" => "pace",
"geoip" => {
"ip" => "14.X.X.X",
"country_code2" => "IN",
"country_code3" => "IND",
"country_name" => "India",
"continent_code" => "AS",
"region_name" => "16",
"city_name" => "Mumbai",
"latitude" => 18.974999999999994,
"longitude" => 72.82579999999999,
"timezone" => "Asia/Calcutta",
"real_region_name" => "Maharashtra",
"location" => [
[0] 72.82579999999999,
[1] 18.974999999999994
],
"coordinates" => [
[0] "72.82579999999999",
[1] "18.974999999999994"
]
}
}
I think my problem is the same as this one, so I tried everything mentioned in that link, such as deleting all the old indices and restarting Logstash and Elasticsearch, but no luck. Any help is appreciated.
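For reference, the cleanup described above can be done along these lines (a rough sketch, assuming Elasticsearch on the default localhost:9200 and the index/template names from the config above):

curl -XDELETE 'http://localhost:9200/pace'               # drop the old index and its stale mapping
curl -XDELETE 'http://localhost:9200/_template/bhushan'  # drop the previously installed template
curl 'http://localhost:9200/_template/bhushan?pretty'    # check which template, if any, is installed

After that, Logstash has to be restarted so it re-uploads the template and recreates the index.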
Answer 0 (score: 1)
Your Logstash filter stores the coordinates in the field geoip.coordinates, but in your elasticsearch-template.json mapping that field is called geoip.location. You can see this in your sample log record, where both fields, location and coordinates, appear in the geoip sub-object.
I think if you change this in your Logstash filter, you should be fine.
From this:
add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
to this:
add_field => [ "[geoip][location]", "%{[geoip][longitude]}" ]
add_field => [ "[geoip][location]", "%{[geoip][latitude]}" ]
UPDATE:
- the two add_field directives in your geoip filter can be removed, since they are unnecessary;
- "path": "full" can be removed, since it is deprecated as of ES v1.0;
- the template should be "pace" instead of "bhushan", i.e. the name of the index in which your log records are stored, as sketched below.