Logstash :: message =>"Failed to flush outgoing items"

Time: 2015-12-01 04:14:59

Tags: elasticsearch logstash

I have set up a 3-node ES cluster (elasticsearch-1.4.2). Logstash works fine when run from 2 of the nodes, but running it on the 3rd node throws the exception shown below. Can you help?

{:timestamp=>"2015-11-30T06:50:58.873000-0700", 
 :message=>"Failed to flush outgoing items", 
 :outgoing_count=>1, 
 :exception=>#<Errno::EBADF: Bad file descriptor - Bad file descriptor>, 
 :backtrace=>[
    "org/jruby/RubyIO.java:2097:in `close'", 
    "//scratch/LOGSTASH/logstash-1.4.2/vendor/bundle/jruby/1.9/gems/ftw-0.0.39/lib/ftw/connection.rb:173:in `connect'",
    "org/jruby/RubyArray.java:1613:in `each'",
    "//scratch/LOGSTASH/logstash-1.4.2/vendor/bundle/jruby/1.9/gems/ftw-0.0.39/lib/ftw/connection.rb:139:in `connect'", 
    "//scratch/LOGSTASH/logstash-1.4.2/vendor/bundle/jruby/1.9/gems/ftw-0.0.39/lib/ftw/agent.rb:406:in `connect'",
    "org/jruby/RubyProc.java:271:in `call'", 
    "//scratch/LOGSTASH/logstash-1.4.2/vendor/bundle/jruby/1.9/gems/ftw-0.0.39/lib/ftw/pool.rb:48:in `fetch'", 
    "//scratch/LOGSTASH/logstash-1.4.2/vendor/bundle/jruby/1.9/gems/ftw-0.0.39/lib/ftw/agent.rb:403:in `connect'", 
    "//scratch/LOGSTASH/logstash-1.4.2/vendor/bundle/jruby/1.9/gems/ftw-0.0.39/lib/ftw/agent.rb:319:in `execute'", 
    "//scratch/LOGSTASH/logstash-1.4.2/vendor/bundle/jruby/1.9/gems/ftw-0.0.39/lib/ftw/agent.rb:217:in `post!'", 
    "//scratch/LOGSTASH/logstash-1.4.2/lib/logstash/outputs/elasticsearch/protocol.rb:106:in `bulk_ftw'", 
    "//scratch/LOGSTASH/logstash-1.4.2/lib/logstash/outputs/elasticsearch/protocol.rb:80:in `bulk'", 
    "//scratch/LOGSTASH/logstash-1.4.2/lib/logstash/outputs/elasticsearch.rb:315:in `flush'", 
    "//scratch/LOGSTASH/logstash-1.4.2/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:219:in `buffer_flush'", 
    "org/jruby/RubyHash.java:1339:in `each'",
    "//scratch/LOGSTASH/logstash-1.4.2/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:216:in `buffer_flush'",
    "//scratch/LOGSTASH/logstash-1.4.2/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:193:in `buffer_flush'",
    "//scratch/LOGSTASH/logstash-1.4.2/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:159:in `buffer_receive'",
    "//scratch/LOGSTASH/logstash-1.4.2/lib/logstash/outputs/elasticsearch.rb:311:in `receive'", 
    "//scratch/LOGSTASH/logstash-1.4.2/lib/logstash/outputs/base.rb:86:in `handle'", 
    "(eval):148:in `initialize'", 
    "org/jruby/RubyProc.java:271:in `call'", 
    "//scratch/LOGSTASH/logstash-1.4.2/lib/logstash/pipeline.rb:266:in `output'",
    "//scratch/LOGSTASH/logstash-1.4.2/lib/logstash/pipeline.rb:225:in `outputworker'",
    "//scratch/LOGSTASH/logstash-1.4.2/lib/logstash/pipeline.rb:152:in `start_outputs'"], 
:level=>:warn}

1 answer:

Answer 0 (score: 0)

I was able to work around this by passing the max open files parameter on the command line when starting ES: bin/elasticsearch -Des.max-open-files=true -f
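For context on the answer above: an Errno::EBADF ("Bad file descriptor") during connection setup is often a symptom of file-descriptor exhaustion, and in Elasticsearch 1.x the `-Des.max-open-files=true` system property makes the node log the limit it detected at startup (it reports the limit rather than raising it). Raising the limit itself is typically done via `ulimit` in the shell before launching the process. A minimal sketch, with an illustrative limit value:

```shell
# Inspect the current soft limit on open file descriptors for this shell
ulimit -n

# Raise the soft limit before launching Elasticsearch; 65535 is a commonly
# recommended value, but the hard limit may cap what an unprivileged user
# can set, so don't abort the script if it fails
ulimit -n 65535 || echo "could not raise limit (hard limit may be lower)"

# Then start Elasticsearch in the foreground; with the property below,
# ES 1.x logs the max open files it detected at startup:
#   bin/elasticsearch -Des.max-open-files=true -f
```

If the limit reported by `ulimit -n` differs across the three nodes, that would explain why only one of them hits this error.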