Filebeat unable to send logs to Logstash with log/harvester error

Date: 2019-03-30 22:51:04

Tags: docker logstash filebeat

I am following this tutorial (Link to tutorial) to get logs from my Docker containers into Elasticsearch via Filebeat and Logstash.

However, nothing shows up in Kibana, and when I run "docker logs" on the Filebeat container I get the following error:

2019-03-30T22:22:40.353Z        ERROR   log/harvester.go:281    Read line error: parsing CRI timestamp: parsing time "-03-30T21:59:16,113][INFO" as "2006-01-02T15:04:05Z07:00": cannot parse "-03-30T21:59:16,113][INFO" as "2006"; File: /usr/share/dockerlogs/data/2f3164397450efdd5851c3fad67fe405ab3dd822bbea1d807a993844e9143d5e/2f3164397450efdd5851c3fad67fe405ab3dd822bbea1d807a993844e9143d5e-json.log
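(For reference, the "2006-01-02T15:04:05Z07:00" in that message is Go's reference time layout for RFC 3339 timestamps, and the fragment being parsed looks like part of a Logstash-style log line, e.g. "[2019-03-30T21:59:16,113][INFO ...]" with the leading "[2019" already consumed. A minimal Go sketch, purely for illustration, reproduces the same parse failure:)

package main

import (
	"fmt"
	"time"
)

func main() {
	// Go's reference layout (RFC 3339), as quoted in the harvester error
	layout := "2006-01-02T15:04:05Z07:00"
	// The fragment Filebeat tried to parse as a CRI timestamp
	_, err := time.Parse(layout, "-03-30T21:59:16,113][INFO")
	fmt.Println(err) // cannot parse "-03-30T21:59:16,113][INFO" as "2006"
}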

My containers are hosted on a Linux VM running on a Windows machine (I am not sure whether this could be causing the error, given the paths involved).

Below I describe what is running, along with the relevant files, in case the tutorial article is taken down in the future.

One container is running that simply executes the following command, printing out lines that Filebeat should be able to read:

CMD while true; do sleep 2 ; echo "{\"app\": \"dummy\", \"foo\": \"bar\"}"; done
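With Docker's default json-file logging driver, each echoed line ends up wrapped in a JSON envelope inside the container's *-json.log file, roughly like this (timestamp illustrative):

{"log":"{\"app\": \"dummy\", \"foo\": \"bar\"}\n","stream":"stdout","time":"2019-03-30T22:22:40.000000000Z"}

This "log" field is what the decode_json_fields processor below is meant to unpack.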

My filebeat.yml file is as follows:

filebeat.inputs:
- type: docker
  combine_partial: true
  containers:
    path: "/usr/share/dockerlogs/data"
    stream: "stdout"
    ids:
      - "*"
  exclude_files: ['\.gz$']
  ignore_older: 10m

processors:
  # decode the log field (a sub JSON document) if it is JSON encoded, then map its fields to elasticsearch fields
- decode_json_fields:
    fields: ["log", "message"]
    target: ""
    # overwrite existing target elasticsearch fields while decoding json fields    
    overwrite_keys: true
- add_docker_metadata:
    host: "unix:///var/run/docker.sock"

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

# setup filebeat to send output to logstash
output.logstash:
  hosts: ["logstash"]

# Write Filebeat's own logs only to file, to avoid Filebeat picking them back up from the docker log files
logging.level: error
logging.to_files: true
logging.to_syslog: false
logging.metrics.enabled: false
logging.files:
  path: /var/log/filebeat
  name: filebeat
  keepfiles: 7
  permissions: 0644
ssl.verification_mode: none
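For context, the Logstash side of this setup (per the tutorial) just needs a beats input on the default port 5044 feeding Elasticsearch; a minimal pipeline sketch, where the host name and index pattern are assumptions rather than values from the post:

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "filebeat-%{+YYYY.MM.dd}"
  }
}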

Any suggestions as to why Filebeat is failing to forward my logs, and how to fix it, would be greatly appreciated. Thanks.

0 Answers:

No answers yet.