Configuration file:
# Input: Kafka messages
input
{
    kafka
    {
        topic_id => 'test2'
    }
}
# Try to match sensor info
filter
{
    json { source => "message" }
}
# StatsD and stdout output
output
{
    stdout
    {
        codec => line
        {
            format => "%{[testmessage][0][key]}"
        }
    }
    stdout { codec => rubydebug }
    statsd
    {
        host => "localhost"
        port => 8125
        increment => ["test.%{[testmessage][0][key]}"]
    }
}
Input Kafka message:
{"testmessage":[{"key":"key-1234"}]}
Output:
key-1234
{
    "testmessage" => [
        [0] {
            "key" => "key-1234"
        }
    ],
      "@version" => "1",
    "@timestamp" => "2015-11-09T20:11:52.374Z"
}
Logs:
{:timestamp=>"2015-11-09T20:29:03.562000+0000", :message=>"Done running kafka input", :level=>:info}
{:timestamp=>"2015-11-09T20:29:03.563000+0000", :message=>"Plugin is finished", :plugin=><LogStash::Outputs::Stdout codec=><LogStash::Codecs::Line format=>"%{[testmessage][0][key]}", charset=>"UTF-8">, workers=>1>, :level=>:info}
{:timestamp=>"2015-11-09T20:29:03.564000+0000", :message=>"Plugin is finished", :plugin=><LogStash::Outputs::Statsd increment=>["test1.test", "test.%{[testmessage][0][key]}"], codec=><LogStash::Codecs::Plain charset=>"UTF-8">, workers=>1, host=>"localhost", port=>8125, namespace=>"logstash", sender=>"%{host}", sample_rate=>1, debug=>false>, :level=>:info}
{:timestamp=>"2015-11-09T20:29:03.564000+0000", :message=>"Pipeline shutdown complete.", :level=>:info}
Very weird: statsd does not work in my Logstash setup and I can't tell why. I've looked at plenty of examples from Google but still can't figure it out. Any suggestions are welcome. Thanks.
Answer 0 (score: 0)
I found the cause: logstash-output-statsd uses UDP by default, but my statsd server was configured to use TCP.
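To double-check which protocol the server accepts, here is a minimal sketch (not from the original answer) that sends the same kind of counter the Logstash statsd output would emit, over plain UDP. The metric name and the default "logstash" namespace come from the config and log lines above, and host/port follow the config; everything else is an assumption.

# Minimal UDP test: mimic the counter that logstash-output-statsd would send.
# statsd counter wire format is "<name>:<value>|c"; the plugin sends over UDP
# by default, as noted in the answer above.
import socket

def send_counter(metric, value=1, host="localhost", port=8125):
    payload = "{}:{}|c".format(metric, value).encode("utf-8")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # UDP datagram socket
    try:
        sock.sendto(payload, (host, port))
    finally:
        sock.close()

if __name__ == "__main__":
    # "logstash" is the plugin's default namespace (visible in the log line above)
    send_counter("logstash.test.key-1234")

If this counter never shows up on the server but a TCP connection to the same port does work, the statsd server is listening on TCP only, which matches the cause described above.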