Hi, I'm trying to set up log analysis using Filebeat and Logstash. Here are the changes I made in my Filebeat configuration:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - D:\elasticsearch-5.4.3\elasticsearch-5.4.3\logs\elasticsearch.log

output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]
Here is my Logstash configuration file:

input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => "%{plugins}" }
  }
  date {
    match => [ "timestamp" , "yyyy-MM-DD:HH:mm:ss" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
When I run this, I see the following error:
[2019-10-22T06:07:32,915][ERROR][logstash.javapipeline    ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{plugins} not defined>, :backtrace=>["D:/logstash-7.1.0/logstash-7.1.0/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1425:in `loop'", "D:/logstash-7.1.0/logstash-7.1.0/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:93:in `compile'", "D:/logstash-7.1.0/logstash-7.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.0.4/lib/logstash/filters/grok.rb:281:in `block in register'", "org/jruby/RubyArray.java:1792:in `each'", "D:/logstash-7.1.0/logstash-7.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.0.4/lib/logstash/filters/grok.rb:275:in `block in register'", "org/jruby/RubyHash.java:1419:in `each'", "D:/logstash-7.1.0/logstash-7.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.0.4/lib/logstash/filters/grok.rb:270:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:56:in `register'", "D:/logstash-7.1.0/logstash-7.1.0/logstash-core/lib/logstash/java_pipeline.rb:191:in `block in register_plugins'", "org/jruby/RubyArray.java:1792:in `each'", "D:/logstash-7.1.0/logstash-7.1.0/logstash-core/lib/logstash/java_pipeline.rb:190:in `register_plugins'", "D:/logstash-7.1.0/logstash-7.1.0/logstash-core/lib/logstash/java_pipeline.rb:446:in `maybe_setup_out_plugins'", "D:/logstash-7.1.0/logstash-7.1.0/logstash-core/lib/logstash/java_pipeline.rb:203:in `start_workers'", "D:/logstash-7.1.0/logstash-7.1.0/logstash-core/lib/logstash/java_pipeline.rb:145:in `run'", "D:/logstash-7.1.0/logstash-7.1.0/logstash-core/lib/logstash/java_pipeline.rb:104:in `block in start'"], :thread=>"#<Thread:0x15997940 run>"}
[2019-10-22T06:07:32,970][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
I'm quite new to this kind of integration and am not sure what I should be looking into. Please help.
Answer 0 (score: 2)
The problem seems to be here:

grok {
  match => { "message" => "%{plugins}" }
}

What is %{plugins} here? It is not a predefined grok pattern. A list of grok patterns can be found here.

Also, per the documentation, the syntax for a grok pattern is %{SYNTAX:SEMANTIC}. You could do something like:

grok {
  match => { "message" => "%{GREEDYDATA:plugins}" }
}
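To see what that pattern does, it helps to know that grok patterns compile down to named-capture regular expressions; GREEDYDATA is essentially ".*". A minimal sketch using Python's re module as a stand-in for grok's regex engine (the sample log line is made up for illustration, not taken from the asker's logs):

```python
import re

# %{GREEDYDATA:plugins} behaves roughly like a named capture group
# whose body is ".*": it swallows the rest of the message and stores
# it in a field called "plugins".
GREEDYDATA = r"(?P<plugins>.*)"

line = "[2019-10-22T06:07:32,915][INFO ][o.e.p.PluginsService] loaded module [x-pack]"
match = re.match(GREEDYDATA, line)

# The whole line ends up in the "plugins" field.
print(match.group("plugins"))
```

So this pattern will always match, but it does no real parsing; the "plugins" field just contains the entire message.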
Answer 1 (score: 0)
Try giving "%{plugins}" a data type:

filter {
  grok {
    match => { "message" => "%{WORD:plugins}" }
  }
}

You can find the data types here. If this doesn't work, try removing the date filter and retry.
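The difference between %{WORD:plugins} and %{GREEDYDATA:plugins} can be sketched with Python's re module standing in for grok (WORD is defined in the standard grok patterns as \b\w+\b; the sample message is made up for illustration):

```python
import re

# Grok's WORD pattern is \b\w+\b, so %{WORD:plugins} captures only the
# first word of the message, whereas GREEDYDATA (".*") takes everything.
WORD = r"\b\w+\b"

msg = "loaded module x-pack"
m = re.match(rf"(?P<plugins>{WORD})", msg)
print(m.group("plugins"))  # -> loaded
```

In other words, %{WORD:plugins} still matches, but the "plugins" field holds just one token rather than the whole message, so pick the pattern that matches the shape of your log lines.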
Answer 2 (score: 0)
Apparently this kind of error can also occur because of a regexp syntax error somewhere deeper in the config file, that is, inside the grok pattern itself.
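The failure mode is a compile-time one: a malformed regexp inside a grok pattern fails when the pipeline registers the filter, which is why Logstash aborts with "Pipeline aborted due to error" before processing any events. A rough illustration of the same idea with Python's re module (stand-in for grok's regex engine):

```python
import re

# An unclosed group in a pattern fails at compile time, not at match
# time -- analogous to Grok::PatternError aborting the pipeline on
# startup rather than on a particular log line.
try:
    re.compile(r"(?P<plugins>unclosed")  # missing the closing ")"
except re.error as exc:
    print("pattern failed to compile:", exc)
```

So if the grok pattern itself looks fine, it is worth checking every regexp fragment in the config for unbalanced brackets or similar syntax slips.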