Inputting multiple Kafka topics into Logstash with different filters and codecs

Date: 2019-04-22 12:57:39

Tags: elasticsearch apache-kafka logstash

I am setting up an ELK stack with Kafka and want to send logs to Logstash through two Kafka topics (topic1 for Windows logs and topic2 for Wazuh logs), each with a different codec and filter. I tried the Logstash input configuration below, but it does not work:

input {
  kafka {
    bootstrap_servers => "kafka:9000"
    topics => ["windowslog", "system02"]
    decorate_events => true
    codec => "json"
    auto_offset_reset => "earliest"
  }
  kafka {
    bootstrap_servers => "kafka-broker:9000"
    topics => ["wazuh-alerts"]
    decorate_events => true
    codec => "json_lines"
  }
}

and the filter.conf file:

filter {
  if [@metadata][kafka][topic] == "wazuh-alerts" {
    if [data][srcip] {
      mutate {
        add_field => [ "@src_ip", "%{[data][srcip]}" ]
      }
    }
    if [data][aws][sourceIPAddress] {
      mutate {
        add_field => [ "@src_ip", "%{[data][aws][sourceIPAddress]}" ]
      }
    }
    geoip {
      source => "@src_ip"
      target => "GeoLocation"
      fields => ["city_name", "country_name", "region_name", "location"]
    }
    date {
      match => ["timestamp", "ISO8601"]
      target => "@timestamp"
    }
    mutate {
      remove_field => [ "timestamp", "beat", "input_type", "tags", "count", "@version", "log", "offset", "type", "@src_ip", "host"]
    }
  }
}

How can I do this?

1 Answer:

Answer 0 (score: 2)

Try using tags on each input and then filtering based on those tags.

For example:

input {
  kafka {
    bootstrap_servers => "kafka-broker:9000"
    topics => ["wazuh-alerts"]
    decorate_events => true
    codec => "json_lines"
    tags => ["wazuh-alerts"]
  }
}

In your filters and outputs, you will need conditionals based on that tag.

filter {
    if "wazuh-alerts" in [tags] {
        your filters
    }
}
output {
    if "wazuh-alerts" in [tags] {
        your output
    }
}
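
Putting it together, a minimal end-to-end sketch might look like the one below. It carries over the broker addresses, topics, and codecs from the question; the `windows-logs` tag and the Elasticsearch index names are assumptions added for illustration:

```
input {
  kafka {
    bootstrap_servers => "kafka:9000"
    topics => ["windowslog", "system02"]
    codec => "json"
    tags => ["windows-logs"]          # hypothetical tag for the Windows topics
  }
  kafka {
    bootstrap_servers => "kafka-broker:9000"
    topics => ["wazuh-alerts"]
    codec => "json_lines"
    tags => ["wazuh-alerts"]
  }
}

filter {
  if "wazuh-alerts" in [tags] {
    # Wazuh-specific filters from the question (geoip, date, mutate, ...) go here
  }
}

output {
  if "wazuh-alerts" in [tags] {
    elasticsearch { index => "wazuh-alerts-%{+YYYY.MM.dd}" }   # assumed index name
  } else if "windows-logs" in [tags] {
    elasticsearch { index => "windows-logs-%{+YYYY.MM.dd}" }   # assumed index name
  }
}
```

Tagging at the input keeps the routing decision explicit in the event itself, so each topic can get its own codec at input time and its own filter and output branch downstream.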