Logstash 2.3.2 connecting to Zookeeper

Date: 2017-03-31 09:27:34

Tags: connection apache-kafka logstash apache-zookeeper consumer

Logstash is installed using Docker.

Goal: Logstash consumes data from Kafka and prints it to the console via stdout.

Specs

Kafka 0.8.2.1

Logstash 2.3.2

Elasticsearch 2.3.3

Docker

Logstash configuration

input {
  kafka {
    zk_connect => 'remoteZookeeperServer:2181'
    topic_id => 'testTopic'
  }
}

output {
  stdout { codec => rubydebug }
}
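
One detail that may matter here (an assumption on my part, not something the logs confirm): with the Kafka 0.8-era input plugin, a consumer group that has no stored offset starts at the latest offset by default, so messages produced before Logstash started never appear. A variant of the input that names the group explicitly and rewinds to the earliest offset would look roughly like this; 'logstash_test' is just a placeholder group name:

input {
  kafka {
    zk_connect => 'remoteZookeeperServer:2181'
    topic_id => 'testTopic'
    group_id => 'logstash_test'        # placeholder consumer group name
    auto_offset_reset => 'smallest'    # start from the earliest offset when none is stored
    reset_beginning => true            # discard any stored offset and re-read from the beginning
  }
}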

Docker logs

{:timestamp=>"2017-03-31T09:13:01.120000+0000", :message=>"Pipeline main started"}
// finished

Docker logs (running Logstash in debug mode)

{:timestamp=>"2017-03-31T08:23:41.987000+0000", :message=>"Reading config file", :config_file=>"/config-dir/logstash-wcs.conf", :level=>:debug, :file=>"logstash/config/loader.rb", :line=>"69", :method=>"local_config"}
{:timestamp=>"2017-03-31T08:23:42.022000+0000", :message=>"Plugin not defined in namespace, checking for plugin file", :type=>"input", :name=>"kafka", :path=>"logstash/inputs/kafka", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"}
{:timestamp=>"2017-03-31T08:23:42.363000+0000", :message=>"Plugin not defined in namespace, checking for plugin file", :type=>"codec", :name=>"json", :path=>"logstash/codecs/json", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"}
{:timestamp=>"2017-03-31T08:23:42.370000+0000", :message=>"config LogStash::Codecs::JSON/@charset = \"UTF-8\"", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"153", :method=>"config_init"}
...
// cannot find warn or error
...
{:timestamp=>"2017-03-31T08:23:42.549000+0000", :message=>"Will start workers for output", :worker_count=>1, :class=>LogStash::Outputs::Stdout, :level=>:debug, :file=>"logstash/output_delegator.rb", :line=>"77", :method=>"register"}
{:timestamp=>"2017-03-31T08:23:42.553000+0000", :message=>"Starting pipeline", :id=>"main", :pipeline_workers=>4, :batch_size=>125, :batch_delay=>5, :max_inflight=>500, :level=>:info, :file=>"logstash/pipeline.rb", :line=>"188", :method=>"start_workers"}
{:timestamp=>"2017-03-31T08:23:42.561000+0000", :message=>"Pipeline main started", :file=>"logstash/agent.rb", :line=>"465", :method=>"start_pipeline"}
{:timestamp=>"2017-03-31T08:23:47.565000+0000", :message=>"Pushing flush onto pipeline", :level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
{:timestamp=>"2017-03-31T08:23:52.566000+0000", :message=>"Pushing flush onto pipeline", :level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
{:timestamp=>"2017-03-31T08:23:57.568000+0000", :message=>"Pushing flush onto pipeline", :level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
{:timestamp=>"2017-03-31T08:24:02.570000+0000", :message=>"Pushing flush onto pipeline", :level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
{:timestamp=>"2017-03-31T08:24:07.573000+0000", :message=>"Pushing flush onto pipeline", :level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
...

Note

When I tested with the kafka_2.9.1-0.8.2.1 console consumer against the same Zookeeper server, the consumer did receive the data.
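
For reference, that console-consumer test would be along these lines (the path to the Kafka installation directory is an assumption on my part):

# run from the Kafka 0.8.2.1 installation directory
bin/kafka-console-consumer.sh --zookeeper remoteZookeeperServer:2181 --topic testTopic --from-beginning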

I think Logstash is not connecting to the Zookeeper server and therefore not consuming any data from Kafka.
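
A quick way to test that suspicion is to probe Zookeeper from inside the Logstash container with the 'ruok' four-letter command; a reachable, healthy server replies 'imok'. This is only a sketch, assuming the container is named logstash and has nc available:

# 'logstash' is a placeholder container name; requires nc inside the container
docker exec -it logstash bash -c "echo ruok | nc remoteZookeeperServer 2181"
# expected reply from a reachable Zookeeper server: imok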

What is the problem?

Why doesn't Logstash log anything about connecting to the Zookeeper server?

0 Answers