I have some scheduled batch jobs that run automatically overnight and write their results to separate .txt files. Is there any way to pull the data from these .txt files and merge it so it can be displayed in Kibana?
Here is a sample of the .txt files (on Windows):
Job Code.txt
job_id:0001,description:Ship data from server to elknode1
job_id:0002,description:Ship data from server to elknode2
job_id:0003,description:Ship data from server to elknode3
job_id:0004,description:Ship data from server to elknode4
Job Status.txt
job_id:0001,result:OK
job_id:0002,result:Error: Msg...
job_id:0003,result:OK
job_id:0004,result:OK
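For completeness, the input side would just be a basic file input along these lines (the paths below are placeholders for wherever the batch jobs drop the files):

input {
  file {
    # Placeholder paths; each job writes one line per job into these files
    path => ["C:/batch_jobs/Job Code.txt", "C:/batch_jobs/Job Status.txt"]
    start_position => "beginning"
    # On Windows, sincedb_path => "NUL" lets the files be re-read while testing
    sincedb_path => "NUL"
  }
}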
Here is the filter part of my very basic logstash.conf:
filter {
  grok {
    # Parse both line formats shown in the sample files above
    match => { "message" => [
      "job_id:%{NOTSPACE:job_id},description:%{GREEDYDATA:description}",
      "job_id:%{NOTSPACE:job_id},result:%{GREEDYDATA:result}"
    ] }
    add_field => {
      "JobID" => "%{job_id}"
      "Description" => "%{description}"
      "Message" => "%{result}"
    }
  }
  if [job_id] == "0001" {
    aggregate {
      task_id => "%{job_id}"
      code => "map['time_elapsed'] = 0"
      map_action => "create"
    }
  }
  if [job_id] == "0003" {
    aggregate {
      task_id => "%{job_id}"
      code => "map['time_elapsed'] = 0"
      map_action => "update"
    }
  }
  if [job_id] == "0002" {
    aggregate {
      task_id => "%{job_id}"
      code => "map['time_elapsed'] = 0"
      map_action => "update"
    }
  }
  if [job_id] == "0004" {
    aggregate {
      task_id => "%{job_id}"
      code => "map['time_elapsed'] = 0"
      map_action => "update"
      end_of_task => true
      timeout => 120
    }
  }
}
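Based on the aggregate filter docs, I think the merge could instead be keyed on job_id, roughly like the sketch below (this would replace the four per-job blocks above, and assumes the description line is read before its matching result line and that the pipeline runs with a single worker). I am not sure this is the right approach:

filter {
  # Runs after the grok above, which sets either [description] or [result]
  if [description] {
    aggregate {
      task_id => "%{job_id}"
      # Remember the description from Job Code.txt for this job_id
      code => "map['description'] = event.get('description')"
      map_action => "create"
    }
  }
  if [result] {
    aggregate {
      task_id => "%{job_id}"
      # Copy the stored description onto the result event from Job Status.txt
      code => "event.set('description', map['description'])"
      map_action => "update"
      end_of_task => true
      timeout => 120
    }
  }
}

The idea is that each job_id is one aggregate task, so the OK/Error line and its description would end up in a single document that Kibana can display.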
Thanks.