ELK Stack: logstash.log grows every second, but the system keeps running

Asked: 2015-11-26 18:48:37

Tags: elasticsearch amazon-ec2 dns logstash elastic-stack

I am running an ELK Stack on AWS. The overall system looks like this: the app servers (currently 4) each run Logstash-Forwarder on their EC2 instance to forward logs to a Logstash-Shipper. The Logstash-Shipper sends them to Kafka/Zookeeper under the topic name "Elk-Stack". A Logstash-Indexer consumes the logs from the "Elk-Stack" topic, filters the log data, and outputs it to Elasticsearch. I have a load balancer in front of a cluster of 2 Elasticsearch EC2 instances, and Kibana is used to visualize the log data for analysis.

My Logstash-Indexer configuration files for input, filter, and output are as follows.

01-log-inout.conf

input {
  kafka {
        zk_connect => "KAFKA-IP-HERE:2181"
        topic_id => "ELK-Stack"
  }
}
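One thing worth double-checking in this input: Kafka topic names are case-sensitive, and the description above says the shipper publishes to "Elk-Stack" while `topic_id` here says "ELK-Stack". If the two really differ, the indexer would silently consume nothing. A trivial illustration (both spellings are taken from this post):

```python
# Kafka treats topic names as exact, case-sensitive strings, so the topic the
# shipper publishes to must match topic_id byte for byte.
published_topic = "Elk-Stack"    # topic name as written in the description
subscribed_topic = "ELK-Stack"   # topic_id in the kafka input above

print(published_topic == subscribed_topic)  # False: these are different topics
```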

20-sys_app1.conf

filter {
      grok {
            match => {
              "message" => "%{IP:Host} \(%{IP:ClientIP}, %{IP:HostIP}\) - - \[%{HTTPDATE:timestamp}\] \"%{GREEDYDATA:MessageNotParsed}\""
            }
        }

        date {
            locale => "en"
            timezone => "UTC"
            match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
            target => "@timestamp"
        }
        mutate {
            add_field => { "debug" => "timestampMatched"}
        }
     geoip {
        source => "ClientIP"
        target => "geoip"
        database => "/etc/logstash/GeoLiteCity.dat"
        add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
        add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}"  ]
    }
    mutate {
        convert => [ "[geoip][coordinates]", "float"]
    }

}
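To sanity-check the grok and date patterns outside of Logstash, here is a rough Python equivalent. The sample log line is hypothetical, and the `IP` and `HTTPDATE` regexes are simplified stand-ins for the real grok patterns (for example, grok's `%{IP}` also matches IPv6, which this sketch ignores):

```python
import re
from datetime import datetime

# Simplified stand-ins for grok's %{IP} and %{HTTPDATE}
IP = r"(?:\d{1,3}\.){3}\d{1,3}"
HTTPDATE = r"\d{2}/\w{3}/\d{4}:\d{2}:\d{2}:\d{2} [+-]\d{4}"

# Python translation of the grok match from 20-sys_app1.conf
pattern = re.compile(
    rf"(?P<Host>{IP}) \((?P<ClientIP>{IP}), (?P<HostIP>{IP})\) - - "
    rf"\[(?P<timestamp>{HTTPDATE})\] \"(?P<MessageNotParsed>.*)\""
)

# Hypothetical sample line shaped like the pattern above
line = '10.0.0.1 (203.0.113.7, 10.0.0.1) - - [26/Nov/2015:18:08:49 +0000] "GET /health HTTP/1.1"'
m = pattern.match(line)
print(m.group("ClientIP"))  # 203.0.113.7

# The date filter's "dd/MMM/YYYY:HH:mm:ss Z" corresponds to this strptime format
ts = datetime.strptime(m.group("timestamp"), "%d/%b/%Y:%H:%M:%S %z")
print(ts.isoformat())  # 2015-11-26T18:08:49+00:00
```

If the grok pattern fails to match your real lines, Logstash tags the event `_grokparsefailure` rather than logging the error shown below, so a pattern mismatch is unlikely to be what is filling logstash.log here.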

99-output-to-es.conf

output { 
    elasticsearch { 
           host => "DNS-NameOfLoadBalancer-eb.amazonaws.com"
           protocol => "http"
    }
}

When I check the Logstash-Indexer log files, I see three different logstash files: ".log", ".stdout", and ".err". The .err and .stdout files don't seem to change, but logstash.log is growing every second. I ran ls -ltr /var/log/logstash/:

(screenshot: output of ls -ltr /var/log/logstash/)

When I look into this file I see a lot of data, but I'm not sure what it means. Does anyone know why this is happening? Here is some of the output I grabbed; it seems to repeat over and over:

{:timestamp=>"2015-11-26T18:08:49.748000+0000", :message=>"Failed to flush outgoing items", :outgoing_count=>1, :exception=>"Manticore::ClientProto
colException", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.4.4-java/lib/manticore/response.rb:35:in `initialize'", "org/jr
uby/RubyProc.java:271:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.4.4-java/lib/manticore/response.rb:70:in `call'", "/opt/l
ogstash/vendor/bundle/jruby/1.9/gems/manticore-0.4.4-java/lib/manticore/response.rb:245:in `call_once'", "/opt/logstash/vendor/bundle/jruby/1.9/gem
s/manticore-0.4.4-java/lib/manticore/response.rb:148:in `code'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.14/lib/ela
sticsearch/transport/transport/http/manticore.rb:71:in `perform_request'", "org/jruby/RubyProc.java:271:in `call'", "/opt/logstash/vendor/bundle/jr
uby/1.9/gems/elasticsearch-transport-1.0.14/lib/elasticsearch/transport/transport/base.rb:191:in `perform_request'", "/opt/logstash/vendor/bundle/j
ruby/1.9/gems/elasticsearch-transport-1.0.14/lib/elasticsearch/transport/transport/http/manticore.rb:54:in `perform_request'", "/opt/logstash/vendo
r/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.14/lib/elasticsearch/transport/client.rb:119:in `perform_request'", "/opt/logstash/vendor/bundl
e/jruby/1.9/gems/elasticsearch-api-1.0.14/lib/elasticsearch/api/actions/bulk.rb:87:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash
-output-elasticsearch-1.0.7-java/lib/logstash/outputs/elasticsearch/protocol.rb:104:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstas
h-output-elasticsearch-1.0.7-java/lib/logstash/outputs/elasticsearch.rb:542:in `submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-outp
ut-elasticsearch-1.0.7-java/lib/logstash/outputs/elasticsearch.rb:541:in `submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-ela
sticsearch-1.0.7-java/lib/logstash/outputs/elasticsearch.rb:566:in `flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsea
rch-1.0.7-java/lib/logstash/outputs/elasticsearch.rb:565:in `flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.21/lib/stud/buffer.rb:21
9:in `buffer_flush'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.21/lib/stud/buffer.rb:216:in `
buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.21/lib/stud/buffer.rb:159:in `buffer_receive'", "/opt/logstash/vendor/bundle/j
ruby/1.9/gems/logstash-output-elasticsearch-1.0.7-java/lib/logstash/outputs/elasticsearch.rb:531:in `receive'", "/opt/logstash/vendor/bundle/jruby/
1.9/gems/logstash-core-1.5.5-java/lib/logstash/outputs/base.rb:88:in `handle'", "(eval):22:in `output_func'", "/opt/logstash/vendor/bundle/jruby/1.
9/gems/logstash-core-1.5.5-java/lib/logstash/pipeline.rb:244:in `outputworker'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.5-ja
va/lib/logstash/pipeline.rb:166:in `start_outputs'"], :level=>:warn}
{:timestamp=>"2015-11-26T18:08:50.283000+0000", :message=>["INFLIGHT_EVENTS_REPORT", "2015-11-26T18:08:50Z", {"input_to_filter"=>1, "filter_to_outp
ut"=>1, "outputs"=>[]}], :level=>:warn}
{:timestamp=>"2015-11-26T18:08:50.478000+0000", :message=>["INFLIGHT_EVENTS_REPORT", "2015-11-26T18:08:50Z", {"input_to_filter"=>20, "filter_to_out
put"=>20, "outputs"=>[]}], :level=>:warn}
{:timestamp=>"2015-11-26T18:08:50.753000+0000", :message=>"Got error to send bulk of actions: The server failed to respond with a valid HTTP respon
se", :level=>:error}

{:timestamp=>"2015-11-26T18:08:50.753000+0000", :message=>"Failed to flush outgoing items", :outgoing_count=>1, :exception=>"Manticore::ClientProtocolException", :backtrace=>[... same backtrace as above ...], :level=>:warn}
{:timestamp=>"2015-11-26T18:08:51.758000+0000", :message=>"Got error to send bulk of actions: The server failed to respond with a valid HTTP response", :level=>:error}

Why is this happening, and what does it mean? I'm confused here, because I can view the log data in Elasticsearch using the load balancer's DNS name with the Head plugin at "http://DNS-NameOfLoadBalancer-eb.amazonaws.com:9200/_plugin/head", and I can see the data in the Discover tab and query it in Kibana.
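As for what the log means: "Failed to flush outgoing items" with `Manticore::ClientProtocolException` and "Got error to send bulk of actions: The server failed to respond with a valid HTTP response" says the elasticsearch output's bulk requests are connecting to something, but the reply is not valid HTTP. Common causes are the client ending up on Elasticsearch's transport port (9300) instead of the HTTP port (9200), or a load balancer listener that is not forwarding HTTP on the port the indexer uses (Head and Kibana can still work if they go through a different, correctly configured listener). One way to test is to open a raw socket to the exact host and port the indexer talks to and see whether the reply starts with an HTTP status line. The probe below is a sketch; the demo runs it against a throwaway local server, and in practice you would point it at the load balancer's DNS name and port 9200:

```python
import socket

def looks_like_http(host, port, timeout=5.0):
    """Return True if host:port answers a bare GET / with an HTTP status line."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(b"GET / HTTP/1.0\r\nHost: %b\r\n\r\n" % host.encode())
        reply = s.recv(64)
    return reply.startswith(b"HTTP/1.")

# Self-contained demo against a throwaway local HTTP server on an ephemeral
# port; replace host/port with the load balancer DNS name and 9200 in practice.
import http.server, threading

srv = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=srv.serve_forever, daemon=True).start()
ok = looks_like_http("127.0.0.1", srv.server_address[1])
print(ok)  # True: this endpoint really speaks HTTP
srv.shutdown()
```

If the probe against the indexer's endpoint returns False (or the connection resets), the bulk requests are hitting a non-HTTP listener, which would produce exactly the repeating warn/error pairs above while leaving the rest of the stack apparently healthy.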

Here is a visual representation of my system.

(image: architecture diagram of the pipeline described above)

Thank you very much for helping me solve this mystery.

0 Answers:

There are no answers yet.