My Kibana 5.6.8 / Logstash setup seems to read only one log file. My logstash.conf in /home/elasticsearch/confLogs is:
input {
  file {
    type => "static"
    path => "/home/elasticsearch/static_logs/**/*Web.log*"
    exclude => "*.zip"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  if [type] == "static" {
    if [message] !~ /(.+)/ {
      drop { }
    }
    grok {
      patterns_dir => "./patterns"
      overwrite => [ "message" ]
      # 2017-08-07 11:47:35,466 INFO [http-bio-10.60.2.19-10267-exec-60] jsch.DeployManagerFileUSImpl (DeployManagerFileUSImpl.java:155) - Deconnexion de l'hote qvizzza3
      # 2017-08-07 11:47:51,775 ERROR [http-bio-10.60.2.19-10267-exec-54] service.BindingsRSImpl (BindingsRSImpl.java:143) - Can't find bindings file deployed on server
      # 2017-08-03 16:01:11,352 WARN [Thread-552] pcf2.AbstractObjetMQDAO (AbstractObjetMQDAO.java:137) - Descripteur de
      match => [ "message", "%{TIMESTAMP_ISO8601:logdate},%{INT} %{LOGLEVEL:logLevel} \[(?<threadname>[^\]]+)\] %{JAVACLASS:package} \(%{JAVAFILE:className}:%{INT:line}\) - %{GREEDYDATA:message}" ]
    }
    # 2017-08-03 16:01:11,352
    date {
      match => [ "logdate", "YYYY-MM-dd hh:mm:ss" ]
      target => "logdate"
    }
  }
}
output {
  elasticsearch { hosts => ["192.168.99.100:9200"] }
}
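One detail worth checking in the config above: the Logstash date filter uses Joda-Time format strings, where `hh` is the 12-hour clock and `HH` the 24-hour clock, so `"YYYY-MM-dd hh:mm:ss"` would fail on any timestamp after 12:59:59. This is only a suspicion about a side issue, not necessarily the single-file problem; the analogous mismatch can be sketched in Python with the `%I` (12-hour) vs `%H` (24-hour) directives:

```python
from datetime import datetime

ts = "2017-08-03 16:01:11"  # an afternoon timestamp, as in the sample logs

# 24-hour directive (analogous to Joda "HH") parses it correctly
parsed = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
print(parsed.hour)  # 16

# 12-hour directive (analogous to Joda "hh") rejects hour 16
try:
    datetime.strptime(ts, "%Y-%m-%d %I:%M:%S")
except ValueError:
    print("parse failed: hour 16 is not valid on a 12-hour clock")
```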
My log directory, containing logrotated files from load-balanced servers:
static_logs
--prd1
----mlog Web.log
----mlog Web.log.1
----mlog Web.log.2
--prd2
----mlog Web.log
----mlog Web.log.2
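The glob pattern itself can be checked in isolation. Logstash's file input expands `path` with Ruby's `Dir.glob`, whose `**` behaves much like Python's recursive glob, so the following sketch (which recreates the directory layout above in a temporary directory) suggests the pattern matches all the rotated files and is probably not the culprit:

```python
import glob
import os
import tempfile

# recreate the rotated-log layout from the question in a temp dir
root = tempfile.mkdtemp()
for rel in ["prd1/mlog Web.log", "prd1/mlog Web.log.1", "prd2/mlog Web.log"]:
    path = os.path.join(root, "static_logs", rel)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    open(path, "w").close()

# same shape as the Logstash path option: static_logs/**/*Web.log*
matches = glob.glob(
    os.path.join(root, "static_logs", "**", "*Web.log*"), recursive=True
)
print(len(matches))  # 3 -- every rotated file in every prd* directory matches
```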
Where is my error? My patterns file is /home/elasticsearch/confLogs/patterns/grok-patterns, which includes TIMESTAMP_ISO8601.
Regards
Answer 0 (score: 0)
It turns out that when a log file exceeds 140 MB, the logdate field produced by my filter is not indexed as a date field but as a string field!