Multiple logs with Logstash

Date: 2017-07-22 17:12:12

Tags: elasticsearch logstash logstash-configuration

I'm following an online tutorial that provides a cars.csv file and the Logstash config file below. Logstash runs perfectly well and is indexing the CSV as we speak.

The problem is that I have another log file (with completely different data) that I need to parse and index into a different index.

  1. How can I add this configuration without restarting Logstash?
  2. If the above isn't possible and I edit the config file then restart Logstash - it won't reindex the entire cars file, will it?
  3. If I do 2., how do I format the config for multiple styles of log file?
  4. For example, my new log file looks like this:

    01-01-2017 ORDER FAILED: £12.11 Somewhere : Fraud
    

    Existing config file:

        input {
          file {
            path => "/opt/cars.csv"
            start_position => "beginning"
            sincedb_path => "/dev/null"
          }
        }

        filter {
          csv {
            separator => ","
            columns => [
              "maker",
              "model",
              "mileage",
              "manufacture_year",
              "engine_displacement",
              "engine_power",
              "body_type",
              "color_slug",
              "stk_year",
              "transmission",
              "door_count",
              "seat_count",
              "fuel_type",
              "date_last_seen",
              "date_created",
              "price_eur"
            ]
          }

          mutate {
            convert => ["mileage", "integer"]
          }
          mutate {
            convert => ["price_eur", "float"]
          }
          mutate {
            convert => ["engine_power", "integer"]
          }
          mutate {
            convert => ["door_count", "integer"]
          }
          mutate {
            convert => ["seat_count", "integer"]
          }
        }

        output {
          elasticsearch {
            hosts => "localhost"
            index => "cars"
            document_type => "sold_cars"
          }

          stdout {}
        }
    

    Config file for orders.log:

        input {
          file {
            path => "/opt/logs/orders.log"
            start_position => "beginning"
            sincedb_path => "/dev/null"
          }
        }

        filter {
          grok {
            match => { "message" => "(?<date>[0-9-]+) (?<order_status>ORDER [a-zA-Z]+): (?<order_amount>£[0-9.]+) (?<order_location>[a-zA-Z]+)( : (?<order_failure_reason>[A-Za-z ]+))?" }
          }

          mutate {
            convert => ["order_amount", "float"]
          }
        }

        output {
          elasticsearch {
            hosts => "localhost"
            index => "sales"
            document_type => "order"
          }

          stdout {}
        }
    

    Disclaimer: I'm a complete newbie. This is my second day using ELK.

2 answers:

Answer 0 (score: 0)

For point 1, you can set the following in your logstash.yml file:

config.reload.automatic: true

Or, when running Logstash with a conf file, start it like this:

bin/logstash -f conf-file-name.conf --config.reload.automatic

Once either of these is in place, you can start Logstash and, from then on, any change you make to the conf file will be picked up automatically.
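
For reference, a minimal logstash.yml sketch (the reload interval line is optional and an assumption here; its exact value format varies between Logstash versions):

    # logstash.yml
    config.reload.automatic: true
    # Optional: how often Logstash checks the pipeline config for changes
    # (value format differs by version, e.g. 3 vs "3s").
    config.reload.interval: 3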

Answer 1 (score: 0)

2. If above isn't possible and I edit the config file then restart logstash - it won't reindex the entire cars file will it?

If you use sincedb_path => "/dev/null", Logstash will not remember where it stopped reading a file and will reindex it on every restart. You will have to remove that line if you want Logstash to remember its position (see here).
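
For example, a sketch of the cars input with a persistent sincedb file instead of /dev/null (the path below is only an illustration; any location Logstash can write to works):

    input {
      file {
        path => "/opt/cars.csv"
        start_position => "beginning"
        # Persist the read position so a restart does not reindex the whole file.
        sincedb_path => "/opt/logstash/sincedb_cars"
      }
    }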

3. How do I format the config for multiple styles of log file.

To support multiple styles of log file, you can add tags on the file inputs (see https://www.elastic.co/guide/en/logstash/5.5/plugins-inputs-file.html#plugins-inputs-file-tags) and then use conditionals in the filter and output configuration (see https://www.elastic.co/guide/en/logstash/5.5/event-dependent-configuration.html#conditionals).

Like this:

file {
    path => "/opt/cars.csv"
    start_position => "beginning" 
    sincedb_path => "/dev/null"
    tags => [ "csv" ]
}


file {
    path => "/opt/logs/orders.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    tags => [ "log" ]
}


if "csv" in [tags] {
    ...
} else if "log" in [tags] {
    ...
}
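
Put together, a single pipeline handling both files could look roughly like this (a sketch assuming the paths, filters, and index names from the question; the mutate/convert blocks for the cars data are left as a placeholder comment for brevity):

    input {
      file {
        path => "/opt/cars.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
        tags => [ "csv" ]
      }
      file {
        path => "/opt/logs/orders.log"
        start_position => "beginning"
        sincedb_path => "/dev/null"
        tags => [ "log" ]
      }
    }

    filter {
      if "csv" in [tags] {
        csv {
          separator => ","
          columns => [
            "maker", "model", "mileage", "manufacture_year",
            "engine_displacement", "engine_power", "body_type",
            "color_slug", "stk_year", "transmission", "door_count",
            "seat_count", "fuel_type", "date_last_seen", "date_created",
            "price_eur"
          ]
        }
        # ... the mutate/convert blocks from the cars config go here ...
      } else if "log" in [tags] {
        grok {
          match => { "message" => "(?<date>[0-9-]+) (?<order_status>ORDER [a-zA-Z]+): (?<order_amount>£[0-9.]+) (?<order_location>[a-zA-Z]+)( : (?<order_failure_reason>[A-Za-z ]+))?" }
        }
        mutate {
          convert => ["order_amount", "float"]
        }
      }
    }

    output {
      if "csv" in [tags] {
        elasticsearch {
          hosts => "localhost"
          index => "cars"
          document_type => "sold_cars"
        }
      } else if "log" in [tags] {
        elasticsearch {
          hosts => "localhost"
          index => "sales"
          document_type => "order"
        }
      }
      stdout {}
    }

Each event keeps the tag assigned by its input, so the same conditional test works in both the filter and the output stages.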