How do I write a Logstash configuration file to separate two different strings (S:Info and S:Warn) from a log file and display the respective counts in Kibana?
I tried filtering with 'grep' in Logstash, but I'm not sure how to get the counts of the two different strings (Info and Warn) into Kibana.
Below is a snippet of the log file:
Apr 23 21:34:07 LogPortSysLog: T:2015-04-23T21:34:07.276 N:933086 S:Info P:WorkerThread0#783 F:USBStrategyBaseAbs.cpp:724 D:T1T: Power request disabled for this cable. Defaulting to 1000mA
Apr 23 21:34:10 LogPortSysLog: T:2015-04-23T21:34:10.570 N:933087 S:Warn P:DasInterfaceThread#791 F:USBStrategyBaseAbs.cpp:1696 D:CP_CONTROL:Unexpected DasChildTag: 27 B:{}
Answer 0 (score: 2)
You need a grok filter. I'm not sure I have the whole format right, but here are my best guesses:
Apr 23 21:34:07 LogPortSysLog: T:2015-04-23T21:34:07.276 N:933086 S:Info P:WorkerThread0#783 F:USBStrategyBaseAbs.cpp:724 D:T1T: Power request disabled for this cable. Defaulting to 1000mA
This translates to:
LOG_TIMESTAMP LOG_NAME: T:ACTUAL_TIMESTAMP N:LOGGED_EVENT_NUMBER S:SEVERITY P:THREAD_NAME F:FILENAME:LINE_NUMBER D:MESSAGE
I seem to have lumped some extra information into MESSAGE, but this should get you started.
Files: data.log contains the two lines you posted, and portlogs.conf contains the Logstash configuration that parses them.
input {
  # You can change this to the file/other inputs
  stdin { }
}

filter {
  grok {
    # "message" is the field name filled in by most inputs with the
    # current line to parse
    # Note: I throw away the log's timestamp and use the message timestamp,
    # which may not be true for all of your logs!
    match => [
      "message",
      "%{SYSLOGTIMESTAMP} %{DATA:name}: T:%{TIMESTAMP_ISO8601:timestamp} N:%{INT:log_number:int} S:%{DATA:severity} P:%{DATA:thread} F:%{DATA:filename}:%{INT:line_number:int} D:%{GREEDYDATA:log_message}"
    ]
  }
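  # Assumption: the original snippet does not show it, but to end up with the
  # @timestamp values in the output below (and no leftover "timestamp" field),
  # a date filter along these lines is needed as well.
  date {
    match        => [ "timestamp", "ISO8601" ]
    remove_field => [ "timestamp" ]
  }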
}

output {
  # Change this to go to your Elasticsearch cluster
  stdout {
    codec => rubydebug
  }
}
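For reference, one way to run this locally (an assumption about your setup: Logstash unpacked in the current directory, with both files next to it; since the config reads from stdin, the log file can simply be piped in):

bin/logstash -f portlogs.conf < data.log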
Combining the two with Logstash, I get this output (running Logstash 1.5 RC3, but RC4 came out this week):
{
    "message" => "Apr 23 21:34:07 LogPortSysLog: T:2015-04-23T21:34:07.276 N:933086 S:Info P:WorkerThread0#783 F:USBStrategyBaseAbs.cpp:724 D:T1T: Power request disabled for this cable. Defaulting to 1000mA",
    "@version" => "1",
    "@timestamp" => "2015-04-24T01:34:07.276Z",
    "host" => "Chriss-MBP-2",
    "name" => "LogPortSysLog",
    "log_number" => 933086,
    "severity" => "Info",
    "thread" => "WorkerThread0#783",
    "filename" => "USBStrategyBaseAbs.cpp",
    "line_number" => 724,
    "log_message" => "T1T: Power request disabled for this cable. Defaulting to 1000mA"
}
{
    "message" => "Apr 23 21:34:10 LogPortSysLog: T:2015-04-23T21:34:10.570 N:933087 S:Warn P:DasInterfaceThread#791 F:USBStrategyBaseAbs.cpp:1696 D:CP_CONTROL:Unexpected DasChildTag: 27 B:{}",
    "@version" => "1",
    "@timestamp" => "2015-04-24T01:34:10.570Z",
    "host" => "Chriss-MBP-2",
    "name" => "LogPortSysLog",
    "log_number" => 933087,
    "severity" => "Warn",
    "thread" => "DasInterfaceThread#791",
    "filename" => "USBStrategyBaseAbs.cpp",
    "line_number" => 1696,
    "log_message" => "CP_CONTROL:Unexpected DasChildTag: 27 B:{}"
}
If you configure the output correctly, these are the two documents that get sent to Elasticsearch. Grok patterns are just regular expressions, so you can absolutely write a pattern that further parses (or doesn't!) the inner parts of log_message, including ignoring pieces like the B:{} above. To ignore something, just don't give it a field name (for example, the :log_message suffix is what names the matched pattern log_message; an unnamed match is simply discarded), as in the sketch below.
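As a rough illustration (this extra grok block is my own addition, not part of the original config), a trailing B:{...} could be stripped out of log_message by matching it with an unnamed %{GREEDYDATA}:

filter {
  grok {
    # Re-parse the already-extracted log_message field; the unnamed
    # %{GREEDYDATA} after " B:" matches but is not stored anywhere.
    match     => [ "log_message", "%{DATA:log_message} B:%{GREEDYDATA}" ]
    overwrite => [ "log_message" ]
    # Only some lines (e.g. the Warn line above) end with B:{}, so don't
    # tag the lines that lack it as parse failures.
    tag_on_failure => []
  }
}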
From there, it's just a matter of loading Kibana and creating a Visualization; it will automatically pick up the fields above and make them searchable. For example, you can search for severity:warn to see only the log lines whose severity is "Warn" (the search is case-insensitive). For exact matches, you can use the automatically added severity.raw field and search for severity.raw:Warn, but that's usually not what users do.
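To get the Info/Warn counts specifically, a terms visualization on the severity field is enough; under the hood it boils down to a terms aggregation roughly like this sketch (assuming the default logstash-* indices and Elasticsearch listening on localhost:9200):

curl -XGET 'http://localhost:9200/logstash-*/_search?pretty' -d '{
  "size": 0,
  "aggs": {
    "severity_counts": {
      "terms": { "field": "severity.raw" }
    }
  }
}'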