Filebeat not sending logs to Logstash

Date: 2018-07-23 08:41:25

Tags: elasticsearch kibana logstash-configuration filebeat

I am using Filebeat with the ELK stack. Logs are not being passed from Filebeat to Logstash. Can anyone help?

Filebeat version: 6.3.0; ELK version: 6.0.0

Filebeat configuration:

filebeat.prospectors:

- type: log
  enabled: true
  paths:
    - '/var/lib/docker/containers/*/*.log'
  ignore_older: 0
  scan_frequency: 10s
  json.message_key: log
  json.keys_under_root: true
  json.add_error_key: true
  multiline.pattern: "^[[:space:]]+(at|\\.{3})\\b|^Caused by:"
  multiline.negate: false
  multiline.match: after
  registry_file: usr/share/filebeat/data/registry
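The multiline pattern above is intended to match Java stack-trace continuation lines (`at ...` and `Caused by:` lines). A quick way to sanity-check such a pattern outside Filebeat is a small script; note that Filebeat uses Go's regexp engine, so this Python sketch substitutes `\s` for the POSIX class `[[:space:]]`:

```python
import re

# Python translation of the Filebeat pattern "^[[:space:]]+(at|\.{3})\b|^Caused by:"
# ([[:space:]] is a POSIX class; \s is the closest Python equivalent).
pattern = re.compile(r"^\s+(at|\.{3})\b|^Caused by:")

lines = [
    'Exception in thread "main" java.lang.RuntimeException: boom',
    "\tat com.example.Main.run(Main.java:12)",
    "Caused by: java.lang.NullPointerException",
    "2018-07-23 08:29:34 INFO starting up",
]

for line in lines:
    # Lines matching the pattern are treated as continuations of the previous event.
    is_continuation = bool(pattern.search(line))
    print(repr(line), "->", is_continuation)
```

The class names and log lines here are made up for illustration; only the pattern comes from the configuration above.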

output.logstash:
  hosts: ["172.31.34.173:5044"]
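A common cause of this symptom is that nothing is listening for the Beats protocol on the Logstash side. For reference, a minimal Logstash pipeline that accepts Filebeat connections on the port used above and forwards events to Elasticsearch might look like this (a sketch; the Elasticsearch address and index name are assumptions, not taken from the question):

```conf
# logstash.conf -- minimal Beats-to-Elasticsearch pipeline (sketch)
input {
  beats {
    port => 5044    # must match the port in Filebeat's output.logstash.hosts
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]           # assumed Elasticsearch address
    index => "filebeat-%{+YYYY.MM.dd}"    # assumed index pattern
  }
}
```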

Filebeat log:

2018-07-23T08:29:34.701Z        INFO    instance/beat.go:225    Setup Beat: filebeat; Version: 6.3.0
2018-07-23T08:29:34.701Z        INFO    pipeline/module.go:81   Beat name: ff01ed6d5ae4
2018-07-23T08:29:34.702Z        WARN    [cfgwarn]       beater/filebeat.go:61   DEPRECATED: prospectors are deprecated, Use `inputs` instead. Will be removed in version: 7.0.0
2018-07-23T08:29:34.702Z        INFO    [monitoring]    log/log.go:97   Starting metrics logging every 30s
2018-07-23T08:29:34.702Z        INFO    instance/beat.go:315    filebeat start running.
2018-07-23T08:29:34.702Z        INFO    registrar/registrar.go:75       No registry file found under: /usr/share/filebeat/data/registry. Creating a new registry file.
2018-07-23T08:29:34.704Z        INFO    registrar/registrar.go:112      Loading registrar data from /usr/share/filebeat/data/registry
2018-07-23T08:29:34.704Z        INFO    registrar/registrar.go:123      States Loaded from registrar: 0
2018-07-23T08:29:34.704Z        WARN    beater/filebeat.go:354  Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2018-07-23T08:29:34.704Z        INFO    crawler/crawler.go:48   Loading Inputs: 1
2018-07-23T08:29:34.705Z        INFO    log/input.go:111        Configured paths: [/var/lib/docker/containers/*/*.log]
2018-07-23T08:29:34.705Z        INFO    input/input.go:87       Starting input of type: log; ID: 2696038032251986622
2018-07-23T08:29:34.705Z        INFO    crawler/crawler.go:82   Loading and starting Inputs completed. Enabled inputs: 1
2018-07-23T08:30:04.705Z        INFO    [monitoring]    log/log.go:124  Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":20,"time":{"ms":22}},"total":{"ticks":50,"time":{"ms":60},"value":50},"user":{"ticks":30,"time":{"ms":38}}},"info":{"ephemeral_id":"5193ce7d-8d09-4e9d-ab4e-e55a5972b4

1 Answer:

Answer 0 (score: 0)

A bit late with the answer, I know, but I ran into the same problem, and after some searching I found that this layout works for me.

filebeat.prospectors:
- paths:
    - '<path to your log>'
  multiline.pattern: '<whatever pattern is needed>'
  multiline.negate: true
  multiline.match:  after
  processors:
  - decode_json_fields:
      fields: ['<whatever field you need to decode>']
      target: json
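As the DEPRECATED warning in the question's log notes, `prospectors` are replaced by `inputs` and removed in Filebeat 7.0. The same layout under the newer key would be (a sketch with the same placeholders):

```yaml
filebeat.inputs:
- type: log
  paths:
    - '<path to your log>'
  multiline.pattern: '<whatever pattern is needed>'
  multiline.negate: true
  multiline.match: after
  processors:
  - decode_json_fields:
      fields: ['<whatever field you need to decode>']
      target: json
```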

Here is a link to a similar question.