My logstash configuration is giving me this error:
whenever I run this command: /opt/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf --auto-reload --debug
reason=>"Expected one of #, {, ,, ] at line 27, column 95 (byte 677) after filter {\n\n\tif [type] == \"s3\" {\n\t\tgrok {\n\t\n \t\t\tmatch => [\"message\", \"%{IP:client} %{USERNAME} %{USERNAME} \\[%{HTTPDATE:timestamp}\\] (?:\"", :level=>:error, :file=>"logstash/agent.rb", :line=>"430", :method=>"create_pipeline"}
It seems to be related to my pattern, but when I test it in the online Grok debugger it gives me the expected result. Please help.
Here is my logstash configuration:
input {
  s3 {
    access_key_id => ""
    bucket => ""
    region => ""
    secret_access_key => ""
    prefix => "access"
    type => "s3"
    add_field => { source => gzfiles }
    sincedb_path => "/dev/null"
    #path => "/home/shubham/logstash.json"
    #temporary_directory => "/home/shubham/S3_temp/"
    backup_add_prefix => "logstash-backup"
    backup_to_bucket => "logstash-nginx-overcart"
  }
}

filter {
  if [type] == "s3" {
    grok {
      match => ["message", "%{IP:client} %{USERNAME} %{USERNAME} \[%{HTTPDATE:timestamp}\] (?:"%{WORD:request} %{URIPATHPARAM:path} HTTP/%{NUMBER:version}" %{NUMBER:reponse} %{NUMBER:bytes} "%{USERNAME}" %{GREEDYDATA:responseMessage})"]
    }
  }
}

output {
  elasticsearch {
    hosts => ''
    index => "accesslogs"
  }
}
Answer (score 0):
You have several unescaped " characters inside your match assignment (for example, the ones around the username field), and they trip up the config parser: it thinks the pattern string ends there. If you escape those quotes with \ it should work.
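For example, a sketch of the corrected filter block, keeping your pattern and field names exactly as posted and only escaping the inner quotes:

filter {
  if [type] == "s3" {
    grok {
      # inner double quotes are written as \" so the Logstash config parser
      # does not treat them as the end of the pattern string
      match => ["message", "%{IP:client} %{USERNAME} %{USERNAME} \[%{HTTPDATE:timestamp}\] (?:\"%{WORD:request} %{URIPATHPARAM:path} HTTP/%{NUMBER:version}\" %{NUMBER:reponse} %{NUMBER:bytes} \"%{USERNAME}\" %{GREEDYDATA:responseMessage})"]
    }
  }
}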