I have written a conf file in Logstash to read JMS logs. The problem is that I cannot break the records apart so that each new record starts on its own line. This is the raw data:
####<Sep 20, 2015 12:00:12 AM> <> <1442678412960> <809000> <ID:<307061.1442678412716.0>> <> <CmsCorpAlsPrd_als_mod!CmsCorpAlsPrd_jmsAls_cdceap7e_32040@abb_audit_als_dQue> <Consumed> <<anonymous>> <MC:CA(local):OAMI(CmsCorpAlsPrd_cdceap7e_32040.jms.connection36.session121.consumer125)> <<?xml version="1.0" encoding="UTF-8"?>
<mes:WLJMSMessage xmlns:mes="http://www.bea.com/WLS/JMS/Message"><mes:Header><mes:JMSDeliveryMode>PERSISTENT</mes:JMSDeliveryMode><mes:JMSExpiration>0<> <> ####<Sep 20, 2015 12:00:13 AM> <> <1442678413018> <392000> <ID:<307061.1442678412943.0>> <> <CmsCorpAlsPrd_als_mod!CmsCorpAlsPrd_jmsAls_cdceap7e_32040@abb_audit_als_dQue> <Produced> <<anonymous>> <> <<?xml version="1.0" encoding="UTF-8"?>
<mes:WLJMSMessage xmlns:mes="http://www.bea.com/WLS/JMS/Message"><mes:Header><mes:JMSDeliveryMode>PERSISTENT</mes:JMSDeliveryMode><mes:JMSExpiration>0<> <>
This is my conf file in Logstash:
input {
  stdin {}
  file {
    type => "txt"
    path => "C:\HA\jms\jms.log"
    start_position => "beginning"
  }
}
filter {
  multiline {
    pattern => "\&"
    what => previous
  }
  grok {
    match => { "message" => ['####<%{GREEDYDATA:Date}>%{SPACE}<>%{SPACE}<%{GREEDYDATA:Millisec_Date}>%{SPACE}<%{GREEDYDATA:Nanosec_Date}>%{SPACE}<ID:<%{GREEDYDATA:JMS_message_ID}>>%{SPACE}<>%{SPACE}<%{GREEDYDATA:JMS_destination_name}>%{SPACE}<%{GREEDYDATA:JMS_message_eventname}>%{SPACE}<<%{GREEDYDATA:JMS_username}>>%{SPACE}<%{GREEDYDATA:JMS_correlationID}>%{SPACE}<%{GREEDYDATA:Mcls}:JMSDeliveryMode>%{WORD:JMSDeliveryMode}</mes:JMSDeliveryMode><mes:JMSExpiration>%{NUMBER:JMSExpiration}<>%{SPACE}<>'] }
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
Everything seems to go well, except that when I run the conf the result it gives me is this:
"@version" => "1",
"@timestamp" => "2016-06-08T06:23:50.543Z",
"path" => "C:\\HA\\jms\\jms.log",
"host" => "WIN-07LLQEN2SJB",
"type" => "txt",
"tags" => [
[0] "multiline"
],
"Date" => "Sep 20, 2015 12:00:12 AM> <> <1442678412960> <809000> <ID:<307061.1
442678412716.0>> <> <CmsCorpAlsPrd_als_mod!CmsCorpAlsPrd_jmsAls_cdceap7e_32040@abb_audit_als_dQue> <Consumed> <<anonymou
s>> <MC:CA(local):OAMI(CmsCorpAlsPrd_cdceap7e_32040.jms.connection36.session121.consumer125)> <<?xml version=\"1.0\"
encoding=\"UTF-8\"?>\n<mes:WLJMSMessage xmlns:mes=\"http://www.bea.com/WLS/JMS/Message\"><mes:Header><
mes:JMSDeliveryMode>PERSISTENT</mes:JMSDeliveryMode><mes:JMSExpiration>0<> <> \n####<
Sep 20, 2015 12:00:13 AM",
"Millisec_Date" => "1442678413018",
"Nanosec_Date" => "392000",
"JMS_message_ID" => "307061.1442678412943.0",
"JMS_destination_name" => "CmsCorpAlsPrd_als_mod!CmsCorpAlsPrd_jmsAls_cdceap7e_32040@abb_audit_als_dQue",
"JMS_message_eventname" => "Produced",
"JMS_username" => "anonymous",
"Mcls" => "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<mes:WLJMSMessage xmlns:mes=\"http:
//www.bea.com/WLS/JMS/Message\"><mes:Header><mes",
"JMSDeliveryMode" => "PERSISTENT",
"JMSExpiration" => "0"
}
Obviously the Date field has swallowed all of the data from the first message and seems to lump it together with the second message's data. Is there any way to break the different records onto new lines to solve this?
Answer 0 (score: 0)
After some time, two things in my configuration helped:
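A minimal sketch of the general idea, under the assumption that every new record begins with the ####< marker: tell the multiline stage to treat any line that does not start with #### as a continuation of the previous event, so a whole record is assembled before grok ever sees it.

filter {
  multiline {
    # assumption: each record in the log starts with "####"
    pattern => "^####"
    # join every line that does NOT match the pattern...
    negate => true
    # ...onto the previous event
    what => "previous"
  }
}

With the records grouped this way, each event handed to grok contains exactly one ####<...> entry, so the Date field can no longer run on into the next record.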