How to query a previous entry from Elasticsearch in Logstash

Time: 2019-03-27 10:37:49

Tags: elasticsearch rubygems logstash filebeat

I am testing an ELK stack setup locally on a Windows machine.
OS - Windows 10 x64
Java -

java version "1.8.0_161"
Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)

ES version: 6.4.0
LS version: 6.4.2
FB version: 6.4.2

I am using an X_T pattern with Filebeat to pick up the files, push the data to Logstash for parsing, and ingest it into ES.

My JSON data looks like the following, for two different dates:

[03/01/2019]

[
{"ReadDate":"2019-03-01", "ReadTime":"09:52:40", "idUser": 1, "currentData": 2},
{"ReadDate":"2019-03-01", "ReadTime":"09:52:40", "idUser": 2, "currentData": 1},
{"ReadDate":"2019-03-01", "ReadTime":"09:52:40", "idUser": 3, "currentData": 0},
{"ReadDate":"2019-03-01", "ReadTime":"09:52:40", "idUser": 4, "currentData": 3},
{"ReadDate":"2019-03-01", "ReadTime":"09:52:40", "idUser": 5, "currentData": 4}
]


[03/02/2019]

[
{"ReadDate":"2019-03-02", "ReadTime":"09:52:40", "idUser": 1, "currentData": 3},
{"ReadDate":"2019-03-02", "ReadTime":"09:52:40", "idUser": 2, "currentData": 2},
{"ReadDate":"2019-03-02", "ReadTime":"09:52:40", "idUser": 3, "currentData": 1},
{"ReadDate":"2019-03-02", "ReadTime":"09:52:40", "idUser": 4, "currentData": 4},
{"ReadDate":"2019-03-02", "ReadTime":"09:52:40", "idUser": 5, "currentData": 5}
]

What I want to achieve is that when a new entry is detected, Logstash should query the old entry (the previous instance), subtract the old currentData from the current currentData, and store the result as usedData.
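For example, using the sample data above for idUser 1, the 2019-03-02 entry should produce usedData = 3 - 2 = 1, with the 2019-03-01 entry acting as the previous instance.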
My Logstash conf looks like this:

input {
  beats {
    port => 9600
  }
}

filter {
  if ([fields][log_type] == "wmeter_expon_test") {
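    # look up the most recent ES document for this event and copy its currentData / @timestamp into PreviousExpon / lastTimeStamp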
    elasticsearch {
       hosts => ["http://127.0.0.1:9200"]
       index => "wmeter_expon_test"
       query_template => "template_test.json"
        fields => { 
                "currentData" => "PreviousExpon"
                "@timestamp" => "lastTimeStamp"
            }
    }
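    # UsedExpon = currentData minus the PreviousExpon value fetched above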
    ruby {
        code => 'event.set("UsedExpon", ((event.get("currentData").to_i)-(event.get("PreviousExpon").to_i)))'
    }  
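    # when WMExpon is "FF", replace currentData with -1 (see the mutate below)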
    if [WMExpon] == "FF" {
        mutate {
          replace => {
            "[type]" => "wmeter_expon_test"
            "currentData" => "-1"
          }
          add_field => { 
            "Status" => 0
          }
          strip => ["ReadTime"]
        }
    } else {  
        mutate {
          replace => {
            "[type]" => "wmeter_expon_test"
          }
          add_field => { 
            "Status" => 0
          }
          strip => ["ReadTime"]
        }
    }
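    # build a deterministic fingerprint from ReadDate + ReadTime + idUser, used as the document_id in the output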
    fingerprint {
        source => ["ReadDate", "ReadTime", "idUser"]
        target => "[@metadata][fingerprint]"
        method => "MURMUR3"
        concatenate_sources => true
    }
  }
}

output {
    if [type] == "wmeter_weight" {
        elasticsearch {  
            hosts => ["http://127.0.0.1:9200"]
            index => "wmeter_info"
            document_type => "_doc"
            action => "create"
            document_id => "%{[@metadata][fingerprint1]}"
        }
    }
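    # default output: index named after the event type, with the fingerprint as document id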
    elasticsearch {
      hosts => ["http://127.0.0.1:9200"]
      index => "%{type}"
      document_type => "_doc"
      document_id => "%{[@metadata][fingerprint]}"
    }
    if [type] == "wmeter_expon" {
        elasticsearch {  
            hosts => ["http://127.0.0.1:9200"]
            index => "wmeter_weight"
            document_type => "_doc"
            action => "create"
            document_id => "%{[@metadata][fingerprint1]}"
        }
    }
    stdout { codec => rubydebug }
}


And template_test.json looks like this:

{
    "query": {
        "query_string": {
            "query": "idUser:%{[idUser]} AND (currentData:>='0' OR currentData:>=0)"
        }
    },
    "size": 1,
    "sort": [{"@timestamp": "desc"}]
}
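
For reference, once Logstash has substituted the %{[idUser]} sprintf reference, the query sent for idUser 1 renders as shown below; the same rendered form appears in the warning log further down.

{
    "query": {
        "query_string": {
            "query": "idUser:1 AND (currentData:>='0' OR currentData:>=0)"
        }
    },
    "size": 1,
    "sort": [{"@timestamp": "desc"}]
}
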
  • What happens
    When a new entry comes in, for some reason it fails to query ES for the previous instance to perform the subtraction, so 0 is used as the previous currentData. As a result, every usedData ends up identical to currentData.
  • The Logstash error log shows the following:
Failed to query elasticsearch for previous event {:index=>"wmeter_expon_test", :query=>{"query"=>{"query_string"=>{"query"=>"idUser:1 AND (currentData:>='0' OR currentData:>=0)"}}, "size"=>1, "sort"=>[{"@timestamp"=>"desc"}]}, :event=>#<LogStash::Event:0x76830963>, :error=>#<Faraday::ConnectionFailed>}
[2019-03-27T15:08:01,820][WARN ][logstash.filters.elasticsearch] Failed to query elasticsearch for previous event {:index=>"wmeter_expon_test", :query=>{"query"=>{"query_string"=>{"query"=>"idUser:%{[idUser]} AND (currentData:>='0' OR currentData:>=0)"}}, "size"=>1, "sort"=>[{"@timestamp"=>"desc"}]}, :event=>#<LogStash::Event:0x328e7a3a>, :error=>#<Faraday::ConnectionFailed>}
[2019-03-27T15:08:01,816][WARN ][logstash.filters.elasticsearch] Failed to query elasticsearch for previous event {:index=>"wmeter_expon_test", :query=>{"query"=>{"query_string"=>{"query"=>"idUser:2 AND (currentData:>='0' OR currentData:>=0)"}}, "size"=>1, "sort"=>[{"@timestamp"=>"desc"}]}, :event=>#<LogStash::Event:0x5b0f6efd>, :error=>#<Faraday::ConnectionFailed>}
[2019-03-27T15:08:02,102][WARN ][logstash.filters.elasticsearch] Failed to query elasticsearch for previous event {:index=>"wmeter_expon_test", :query=>{"query"=>{"query_string"=>{"query"=>"idUser:3 AND (currentData:>='0' OR currentData:>=0)"}}, "size"=>1, "sort"=>[{"@timestamp"=>"desc"}]}, :event=>#<LogStash::Event:0x66847583>, :error=>#<Faraday::ConnectionFailed>}
[2019-03-27T15:08:02,215][WARN ][logstash.filters.elasticsearch] Failed to query elasticsearch for previous event {:index=>"wmeter_expon_test", :query=>{"query"=>{"query_string"=>{"query"=>"idUser:4 AND (currentData:>='0' OR currentData:>=0)"}}, "size"=>1, "sort"=>[{"@timestamp"=>"desc"}]}, :event=>#<LogStash::Event:0xa639f6e>, :error=>#<Faraday::ConnectionFailed>}
[2019-03-27T15:08:02,287][WARN ][logstash.filters.elasticsearch] Failed to query elasticsearch for previous event {:index=>"wmeter_expon_test", :query=>{"query"=>{"query_string"=>{"query"=>"idUser:5 AND (currentData:>='0' OR currentData:>=0)"}}, "size"=>1, "sort"=>[{"@timestamp"=>"desc"}]}, :event=>#<LogStash::Event:0x7d9ed706>, :error=>#<Faraday::ConnectionFailed>}

Can anyone help?

  • I have checked that ES is up and running on 127.0.0.1:9200
  • Data ingestion works fine, except that it uses the wrong value for usedData

  • Basically, Logstash should query the previous instance, subtract the old currentData from the current currentData, and put the result into the usedData field.

  • This used to work intermittently, but it recently stopped working; it no longer works in production and/or locally.
  • Reference: https://www.elastic.co/guide/en/logstash/current/plugins-filters-elasticsearch.html

0 Answers:

There are no answers yet.