I have the following configuration in logstash.conf, and I started Logstash with the command `logstash --verbose -f D:\ELK\logstash-5.6.0\logstash-5.6.0\logstash.conf`. Elasticsearch is running on port 9200, but Logstash is not pipelining the parsed log file contents into Elasticsearch. Am I missing any configuration, or am I doing something wrong here?
input {
  file {
    path => "D:/server.log"
    start_position => "beginning"
    type => "logs"
  }
}
filter {
  grok {
    match => {
      'message'  => '\[%{TIMESTAMP_ISO8601:logtime}\]%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}\[(?<threadname>[^\]]+)\]%{SPACE}%{WORD}\:%{WORD}\:%{WORD}%{SPACE}\(%{WORD:className}\.%{WORD}\:%{WORD}\)%{SPACE}\-%{SPACE}%{GREEDYDATA:errorDescription}'
      'message1' => '\[%{TIMESTAMP_ISO8601:logtime}\]%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}\[(?<threadname>[^\]]+)\]%{SPACE}%{WORD}\:%{WORD}\:%{WORD}:%{WORD}%{SPACE}\(%{WORD:className}\.%{WORD}\:%{WORD}\)%{SPACE}\-%{SPACE}%{GREEDYDATA:errorDescription}'
      'message2' => '\[%{TIMESTAMP_ISO8601:logtime}\]%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}\[(?<threadname>[^\]]+)\]%{SPACE}\(%{WORD:className}\.%{WORD}\:%{WORD}\)%{SPACE}\-%{SPACE}%{GREEDYDATA:errorDescription}'
    }
    add_field => {
      'eventName' => 'grok'
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "tuesday"
  }
}
Here is my sample log content:
[2018-02-12 05:25:22,996] ERROR [VBH-1] (ClassA.java:55) - Could not process a new task
[2018-02-13 08:02:24,690] ERROR [CTY-2] C:31:cvbb09:0x73636711c67k4g2e (ClassB.java:159) - Calling command G Update on server http://localhost/TriggerDXFGeneration?null failed because server responded with http status 400 response was: ?<?xml version="1.0" encoding="utf-8"?>
[2018-02-13 08:02:24,690] DEBUG [BHU-2] C:31:cvbb09:0x73636711c67k4g2e (ClassC.java:836) - insertDxfProcessingQueue() called with ConfigID : FTCC08_0X5A3A7E222DD2171B
[2018-02-13 08:07:51,087] ERROR [http-apr-50101-exec-2] C:10:cvbb09 (ClassD.java:133) - Exception on TestScheduler():
The log content is not being parsed; this is the event Logstash produces:
{
"path" => "D://ELK/server.log",
"@timestamp" => 2018-02-19T16:01:12.083Z,
"@version" => "1",
"host" => "AAEINBLR05971L",
"message" => "[2018-02-13 08:02:24,690] DEBUG [BHU-2] C:31:cvbb09:0x73636711c67k4g2e (ClassC.java:836) - insertDxfProcessingQueue() called with ConfigID : FTCC08_0X5A3A7E222DD2171B\r",
"type" => "logs",
"tags" => [
[0] "_grokparsefailure"
]
}
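To sanity-check the grok patterns themselves, I tried approximating them as plain Python regexes. The grok macros (TIMESTAMP_ISO8601, SPACE, LOGLEVEL, WORD, GREEDYDATA) are hand-expanded here, so this is only a rough approximation of what Logstash actually compiles, not its exact regexes:

```python
import re

# Hand-expanded approximations of the grok macros used in the config.
TS = r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}"      # ~ %{TIMESTAMP_ISO8601}
HEAD = (rf"\[(?P<logtime>{TS})\]\s*(?P<loglevel>[A-Z]+)\s*"
        r"\[(?P<threadname>[^\]]+)\]\s*")
TAIL = r"\((?P<className>\w+)\.\w+:\w+\)\s*-\s*(?P<errorDescription>.*)"

# Rough equivalents of the three patterns in the grok filter.
patterns = {
    "three-part id": HEAD + r"\w+:\w+:\w+\s*" + TAIL,        # 'message'
    "four-part id":  HEAD + r"\w+:\w+:\w+:\w+\s*" + TAIL,    # 'message1'
    "no id":         HEAD + TAIL,                            # 'message2'
}

lines = [
    "[2018-02-12 05:25:22,996] ERROR [VBH-1] (ClassA.java:55) - "
    "Could not process a new task",
    "[2018-02-13 08:02:24,690] DEBUG [BHU-2] C:31:cvbb09:0x73636711c67k4g2e "
    "(ClassC.java:836) - insertDxfProcessingQueue() called with ConfigID : "
    "FTCC08_0X5A3A7E222DD2171B",
    "[2018-02-13 08:07:51,087] ERROR [http-apr-50101-exec-2] C:10:cvbb09 "
    "(ClassD.java:133) - Exception on TestScheduler():",
]

# Each sample line is matched by exactly one of the three patterns.
for line in lines:
    hits = [name for name, pat in patterns.items() if re.match(pat, line)]
    print(hits)
```

So, at least under this approximation, every sample line is covered by one of the three patterns, which makes the `_grokparsefailure` even more confusing to me.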