Docker: shipping a log file written inside a container to an ELK stack

Asked: 2019-10-24 15:59:46

Tags: docker elasticsearch docker-container

I am running a Docker application with Django, and Python logging configured in the Django settings writes my API logs to /path/to/workdir/logs/django.log inside a logs folder. When I restart the container, my log file is deleted as well (which is understandable, since the container filesystem is ephemeral).

I want to ship my logs somewhere (to elasticsearch, for example). I am confused because my searches keep telling me to ship the path /var/lib/docker/containers/*/*.log, but I don't think that is what I want.

Any ideas on how I can ship the logs from my container to an ELK stack?

1 Answer:

Answer 0 (score: 1)

You can use the gelf logging driver to ship logs from a docker container's stdout/stderr to elasticsearch.

Configure the services with the gelf logging driver (docker-compose.yml):

version: '3.7'
x-logging:
  &logstash
  options:
    gelf-address: "udp://localhost:12201"
  driver: gelf
services:
  nginx:
    image: 'nginx:1.17.3'
    hostname: 'nginx'
    domainname: 'example.com'
    depends_on:
    - 'logstash'
    ports:
    - '80:80'
    volumes:
    - '${PWD}/nginx/nginx.conf:/etc/nginx/nginx.conf:ro'
    logging: *logstash
  elasticsearch:
    image: 'elasticsearch:7.1.1'
    environment:
    - 'discovery.type=single-node'
    volumes:
    - 'elasticsearch:/usr/share/elasticsearch/data'
    expose:
    - '9200'
    - '9300'
  kibana:
    image: 'kibana:7.1.1'
    depends_on:
    - 'elasticsearch'
    ports:
    - '5601:5601'
    volumes:
    - '${PWD}/kibana/kibana.yml:/usr/share/kibana/config/kibana.yml'
  logstash:
    build: 'logstash'
    depends_on:
    - 'elasticsearch'
    volumes:
    - 'logstash:/usr/share/logstash/data'
    ports:
    - '12201:12201/udp'
    - '10514:10514/udp'
volumes:
  elasticsearch:
  logstash:

Note: the example above configures the logging using extension fields.
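The `x-logging` extension field defines a YAML anchor (`&logstash`) that each service reuses via `logging: *logstash`; written out, the nginx service's logging section is equivalent to:

```yaml
# Expanded form of `logging: *logstash` from the compose file above
logging:
  driver: gelf
  options:
    gelf-address: "udp://localhost:12201"
```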

The minimal nginx.conf used by this example:

user nginx;
worker_processes 1;
error_log /var/log/nginx/error.log debug;

pid /var/run/nginx.pid;

events {
    worker_connections  1024;
}

http {
  server {
    listen 80;
    server_name _;

    location / {
      return 200 'OK';
    }
  }
}

The logstash image is a custom build using the following Dockerfile:

FROM logstash:7.1.1

USER 0
COPY pipeline/gelf.cfg /usr/share/logstash/pipeline
COPY pipeline/pipelines.yml /usr/share/logstash/config
COPY settings/logstash.yml /usr/share/logstash/config
COPY patterns /usr/share/logstash/patterns

RUN rm /usr/share/logstash/pipeline/logstash.conf
RUN chown -R 1000:0 /usr/share/logstash/pipeline /usr/share/logstash/patterns /usr/share/logstash/config
USER 1000

...the relevant logstash gelf plugin configuration:

input {
  gelf {
    type => docker
    port => 12201
  }
}

filter { }

output {
  if [type] == "docker" {
    elasticsearch { hosts => ["elasticsearch:9200"] }
    stdout { codec => rubydebug }
  }
}

...and pipelines.yml:

- pipeline.id: "gelf"
  path.config: "/usr/share/logstash/pipeline/gelf.cfg"

The process running in the container logs to stdout/stderr; docker pushes the logs to logstash using the gelf logging driver (note that the gelf address is localhost rather than the service name, because docker service discovery is not available to resolve it: the port must be mapped to the host and the logging driver configured against localhost). logstash then outputs the logs to elasticsearch, where you can index them in kibana:
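What the gelf driver sends on that UDP port is simply a gzip-compressed JSON document (GELF 1.1). The pipeline can be exercised by hand-rolling one such message and sending it to the same address as in the compose file; the hostname and the extra `_container_name` field below are illustrative:

```python
# Hand-rolled GELF 1.1 message, sent the way the docker gelf driver
# does: a gzip-compressed JSON document over UDP.
import gzip
import json
import socket
import time

message = {
    "version": "1.1",
    "host": "my-container",            # illustrative hostname
    "short_message": "hello from the app",
    "timestamp": time.time(),
    "level": 6,                        # syslog severity: INFO
    "_container_name": "nginx",        # additional fields use a "_" prefix
}

payload = gzip.compress(json.dumps(message).encode("utf-8"))

# Same address as the compose file's gelf-address: udp://localhost:12201
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(payload, ("localhost", 12201))
sock.close()
```

If the stack is up, the message shows up in the `gelf` pipeline and lands in elasticsearch like any container log line.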

(Screenshot: the shipped logs indexed in Kibana)