ELK for Windows log processing

Time: 2015-04-10 08:16:21

Tags: elasticsearch logstash event-log kibana

I have a working ELK stack on Debian Wheezy and have set up Nxlog to collect Windows event logs. I can see the logs in Kibana, so everything works, but I am getting far too much data and would like to trim it by removing some fields I don't need.

I wrote a filter section, but it does not work at all. What could be the reason? My configuration is below:

input {
    tcp {
        type   => "eventlog"
        port   => 3515
        format => "json"
    }
}
filter {
    type => "eventlog"
    mutate {
        remove => { "Hostname", "Keywords", "SeverityValue", "Severity", "SourceName", "ProviderGuid" }
        remove => { "Version", "Task", "OpcodeValue", "RecordNumber", "ProcessID", "ThreadID", "Channel" }
        remove => { "Category", "Opcode", "SubjectUserSid", "SubjectUserName", "SubjectDomainName" }
        remove => { "SubjectLogonId", "ObjectType", "IpPort", "AccessMask", "AccessList", "AccessReason" }
        remove => { "EventReceivedTime", "SourceModuleName", "SourceModuleType", "@version", "type" }
        remove => { "_index", "_type", "_id", "_score", "_source", "KeyLength", "TargetUserSid" }
        remove => { "TargetDomainName", "TargetLogonId", "LogonType", "LogonProcessName", "AuthenticationPackageName" }
        remove => { "LogonGuid", "TransmittedServices", "LmPackageName", "ProcessName", "ImpersonationLevel" }
    }
}
output {
    elasticsearch {
        cluster => "wisp"
        node_name => "io"
    }
}

1 Answer:

Answer 0 (score: 0)

I think you are trying to remove fields that do not exist in some of your logs. Do all of your logs contain every field you are trying to remove? If not, you have to identify the log before removing the fields. Your filter configuration would look like this:

filter {
    if [type] == "eventlog" {
        if [somefield] == "somevalue" {
            mutate {
                remove_field => [ "specificfieldtoremove1", "specificfieldtoremove2" ]
            }
        }
    }
}
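
Two further syntax points, offered as a sketch rather than anything taken from the original answer: a bare type => "eventlog" line directly inside filter { } is not a plugin, and a conditional does that job instead; and mutate's removal option expects an array of field names, remove_field => [ ... ], rather than a hash. Also, _index, _type, _id, _score and _source are Elasticsearch document metadata that Kibana displays, not fields on the Logstash event, so there is nothing for the filter to remove there. Applied to the configuration from the question (field list abridged), the filter might look like this:

filter {
    # A conditional replaces the bare "type =>" line (Logstash 1.2+ syntax)
    if [type] == "eventlog" {
        mutate {
            # remove_field takes an array of field names; append the remaining
            # fields from the question to the same list as needed
            remove_field => [
                "Hostname", "Keywords", "SeverityValue", "Severity",
                "SourceName", "ProviderGuid", "Version", "Task",
                "OpcodeValue", "RecordNumber", "ProcessID", "ThreadID",
                "Channel", "EventReceivedTime", "SourceModuleName", "SourceModuleType"
            ]
        }
    }
}

Depending on the Logstash version, the tcp input may also need codec => "json" in place of the older format => "json" option.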