Docker: unable to send data from Logstash container to Kafka container

Date: 2016-08-18 20:52:50

Tags: docker containers logstash apache-kafka docker-image

I have two Docker containers, one running Logstash and the other running Zookeeper and Kafka. I am trying to send data from Logstash to Kafka, but I can't seem to get any data through to my topic in Kafka.

I can log into the Kafka Docker container, produce a message to my topic from the terminal, and then consume it as well.
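The smoke test inside the Kafka container looks roughly like this, using Kafka's stock console tools; the relative paths and the ZooKeeper port 2181 are assumptions about the container's layout:

bin/kafka-console-producer.sh --broker-list localhost:9092 --topic MyTopicName
bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic MyTopicName --from-beginning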

I am using the kafka output plugin:

output {
    kafka {
        topic_id => "MyTopicName"
        broker_list => "kafkaIPAddress:9092"
    }
}

The IP address is the one I got from running docker inspect kafka2.
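To pull just the IP, an inspect format string works as well; this is a sketch assuming the container sits on the default bridge network:

docker inspect -f '{{ .NetworkSettings.IPAddress }}' kafka2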

When I run ./bin/logstash agent --config /etc/logstash/conf.d/01-input.conf I get this error.

Settings: Default pipeline workers: 4
Unknown setting 'broker_list' for kafka {:level=>:error}
Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: Something is wrong with your configuration.>, :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/config/mixin.rb:134:in `config_init'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/outputs/base.rb:63:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:74:in `register'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:181:in `start_workers'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:136:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/agent.rb:473:in `start_pipeline'"], :level=>:error}
stopping pipeline {:id=>"main"}

I checked the configuration file by running the following command, which returned OK.

 ./bin/logstash agent --configtest --config /etc/logstash/conf.d/01-input.conf
Configuration OK

Has anyone come across this problem before? Is it that I need to open a port on the Kafka container, and if so, how can I do that while keeping Kafka running?

1 Answer:

Answer 0 (score: 1)

The error is in broker_list => "kafkaIPAddress:9092".

Try bootstrap_servers => "kafkaIPAddress:9092" instead. If you have the containers on different machines, map Kafka's port 9092 to the host and use the host address:port; if they are on the same host, use the internal Docker IP:port.
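With that change, the output block from the question becomes:

output {
    kafka {
        topic_id => "MyTopicName"
        bootstrap_servers => "kafkaIPAddress:9092"
    }
}

For the cross-machine case, publishing the broker port when starting the Kafka container would look roughly like the sketch below; the image name is a placeholder, and depending on the image you may also need to configure the broker's advertised host/port so clients can reach it:

docker run -d --name kafka2 -p 9092:9092 <your-kafka-image>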