Parsing JSON with Logstash to insert it into Elasticsearch

Date: 2017-11-01 10:24:48

Tags: json elasticsearch logstash

I want a Logstash instance to read a JSON file and send its contents to an Elasticsearch cluster. I am using the following pipeline configuration file:

input {
    file {
        path => ["/path/to/proba.json"]
        type => "json"
        start_position => "beginning"
        ignore_older => 0
    }
}

output {
    elasticsearch {
        hosts => "http://172.19.238.10:9200"
        index => "proba"
    }
}
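
(For comparison, a variant of the same pipeline I would also try is sketched below. As far as I understand, type => "json" on the file input only tags events and does not itself parse the message, so this variant uses the json codec instead; the sincedb_path => "/dev/null" setting and the extra stdout output are debugging aids added here, not part of my actual setup, and ignore_older is left out.)

input {
    file {
        path => ["/path/to/proba.json"]
        start_position => "beginning"
        sincedb_path => "/dev/null"   # testing only: re-read the file from the beginning on every run
        codec => "json"               # decode each line as one JSON document
    }
}

output {
    elasticsearch {
        hosts => "http://172.19.238.10:9200"
        index => "proba"
    }
    stdout {
        codec => rubydebug            # also print each event, so it is visible whether anything is read at all
    }
}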

I expected a new index named 'proba', containing the JSON document, to appear in Elasticsearch, but nothing happens at all. The Logstash log:

[2017-11-01T10:12:46,540][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2017-11-01T10:12:46,542][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2017-11-01T10:12:46,553][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"arcsight", :directory=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/x-pack-5.6.0-java/modules/arcsight/configuration"}
[2017-11-01T10:12:47,042][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://172.19.238.10:9200/]}}
[2017-11-01T10:12:47,042][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://172.19.238.10:9200/, :path=>"/"}
[2017-11-01T10:12:47,089][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://172.19.238.10:9200/"}
[2017-11-01T10:12:47,089][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-11-01T10:12:47,113][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-11-01T10:12:47,116][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://172.19.238.10:9200"]}
[2017-11-01T10:12:47,121][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://172.19.238.10:9200/]}}
[2017-11-01T10:12:47,121][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://172.19.238.10:9200/, :path=>"/"}
[2017-11-01T10:12:47,124][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://172.19.238.10:9200/"}
[2017-11-01T10:12:47,127][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-11-01T10:12:47,130][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-11-01T10:12:47,133][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://172.19.238.10:9200"]}
[2017-11-01T10:12:47,138][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://172.19.238.10:9200/]}}
[2017-11-01T10:12:47,139][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://172.19.238.10:9200/, :path=>"/"}
[2017-11-01T10:12:47,145][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://172.19.238.10:9200/"}
[2017-11-01T10:12:47,146][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-11-01T10:12:47,149][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-11-01T10:12:47,152][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://172.19.238.10:9200"]}
[2017-11-01T10:12:47,155][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://172.19.238.10:9200/]}}
[2017-11-01T10:12:47,156][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://172.19.238.10:9200/, :path=>"/"}
[2017-11-01T10:12:47,158][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://172.19.238.10:9200/"}
[2017-11-01T10:12:47,159][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-11-01T10:12:47,165][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-11-01T10:12:47,170][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://172.19.238.10:9200"]}
[2017-11-01T10:12:47,173][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-11-01T10:12:47,306][INFO ][logstash.pipeline        ] Pipeline main started
[2017-11-01T10:12:47,349][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

The content of proba.json is:

{"name": "nobody","age": "1000"}

Both Elasticsearch and Logstash are version 5.6.0.

Any idea what is going wrong?

Thanks!

0 Answers:

There are no answers yet.