I am trying to push my log file to Elasticsearch through Logstash and display it in Kibana. It works fine for single-line log records, but fails as soon as the multiline filter is involved.
Here is my sample multiline log input:
2016-06-02T04:02:29,720 INFO Thread-25-match-entity-bolt a52488cc-316b-402e-af58-3b8a663cd76a STDIO invoke Error processing message:{
  "eid": "f9f16541-4fab-4131-a82e-e3ddf6fcd949",
  "entityInfo": {
    "entityType": "style",
    "defaultLocale": "en-US"
  },
  "systemInfo": {
    "tenantId": "t1"
  },
  "attributesInfo": {
    "externalId": 1514,
    "attributesRead": {
      "IsEntityVariantsValid": false,
      "IsEntityExtensionsValid": false
    },
    "attributesUpdated": {
      "DateAttribute": "2016-06-01T00:00:00.0000000",
      "IsEntitySelfValid": true,
      "IsEntityMetaDataValid": true,
      "IsEntityCommonAttributesValid": true,
      "IsEntityCategoryAttributesValid": true,
      "IsEntityRelationshipsValid": true
    }
  },
  "jsAttributesInfo": {
    "jsRelationship": {
      "entityId": "CottonMaterial001",
      "parentEntityId": "Apparel",
      "category": "Apparel",
      "categoryName": "Apparel",
      "categoryPath": "Apparel",
      "categoryNamePath": "Apparel",
      "variant": "1514",
      "variantPath": "1035/1514",
      "container": "Demo Master",
      "containerName": "Demo Master",
      "containerPath": "DemoOrg/Demo Master/Apparel",
      "organization": "DemoOrg",
      "segment": "A"
    },
    "jsChangeContext": {
      "entityAction": "update",
      "user": "cfadmin",
      "changeAgent": "EntityEditor.aspx",
      "changeAgentType": "PIM",
      "changeInterface": "Entity",
      "sourceTimestamp": "2016-06-01T19:48:19.4162475+05:30",
      "ingestTimestamp": "2016-06-01T19:48:19.4162475+05:30"
    }
  }
}
So far, I have tried these Logstash configurations. The first:
input {
  file {
    path => "path_to_logs/logs.log"
    start_position => "beginning"
  }
}
filter {
  multiline {
    negate => "true"
    pattern => "^%{TIMESTAMP_ISO8601} "
    what => "previous"
  }
  grok {
    match => { "message" => "^%{TIMESTAMP_ISO8601:JigsawTimestamp}%{SPACE}%{LOGLEVEL:JigsawLoglevel}%{SPACE}%{HOSTNAME:ThreadName}%{SPACE}%{UUID:GUID}%{SPACE}%{JAVACLASS:JigsawClassName}%{SPACE}%{WORD:JigsawMethodName}%{SPACE}%{GREEDYDATA:JigsawLogMessage}" }
  }
}
output {
  if "_grokparsefailure" not in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
    }
  }
}
And the second:
input {
  file {
    path => "path_to_logs/logs.log"
    start_position => "beginning"
    codec => multiline {
      negate => "true"
      pattern => "^%{TIMESTAMP_ISO8601} "
      what => "previous"
    }
  }
}
filter {
  grok {
    match => { "message" => "^%{TIMESTAMP_ISO8601:JigsawTimestamp}%{SPACE}%{LOGLEVEL:JigsawLoglevel}%{SPACE}%{HOSTNAME:ThreadName}%{SPACE}%{UUID:GUID}%{SPACE}%{JAVACLASS:JigsawClassName}%{SPACE}%{WORD:JigsawMethodName}%{SPACE}%{GREEDYDATA:JigsawLogMessage}" }
  }
}
output {
  if "_grokparsefailure" not in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
    }
  }
}
I have also tried this multiline pattern:
pattern => "^\s"
However, none of this helped; every event ended up with the _grokparsefailure tag. I want the JSON lines to become part of a single message. Please point out the mistake in this filter.
Answer 0 (score: 0)
There are a couple of mistakes in your grok filter, which is why you cannot see any of your logs.
Why is JAVACLASS wrong?
It is defined as follows:
JAVACLASS (?:[a-zA-Z0-9-]+\.)+[A-Za-z0-9$]+
As per the above, JAVACLASS requires at least one period (.) to appear in the text. In your log, however, the class name is simply STDIO, which contains none.
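For comparison, the %{WORD} pattern used in the corrected match below is defined in the standard grok patterns as a plain word-boundary match, so a bare token such as STDIO satisfies it without any period:
WORD \b\w+\b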
Replace your grok match with the following:
match => { "message" => "^%{TIMESTAMP_ISO8601:JigsawTimestamp}%{SPACE}%{LOGLEVEL:JigsawLoglevel}%{SPACE}%{SPACE}%{HOSTNAME:ThreadName}%{SPACE}%{UUID:GUID}%{SPACE}%{WORD:JigsawClassName}%{SPACE}%{WORD:JigsawMethodName}%{SPACE}%{GREEDYDATA:JigsawLogMessage}" }
Also, to make debugging easier, add the stdout plugin to your output so that events are echoed to the console as well, like this:
output {
  if "_grokparsefailure" not in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
    }
    stdout { codec => rubydebug }
  }
}
That will make it much easier to understand errors while Logstash processes your data.
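For example, assuming the configuration above is saved as logstash.conf (a hypothetical filename), you can run Logstash against it from the installation directory and watch each parsed event printed to the console in rubydebug form:
bin/logstash -f logstash.conf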