Is it common practice to use fluentd to parse the docker json-file logs in /var/lib/docker/containers/*/*?

Asked: 2018-04-19 08:03:20

Tags: java docker logging kubernetes fluentd

There are a few ways to collect the container logs of docker or k8s:

  1. Use a streaming log driver such as gelf or fluentd. But then the docker logs command can no longer be used for debugging, so this is not a good way to solve the problem. See: Send log to multiple log drivers, Enhance local logging / remote logging
  2. Since docker's logging drivers do not support multiline messages, the logs in /var/lib/docker/containers/*/*.json are written line by line. See: log driver should support multiline

    1. Use a log collector program such as filebeat, fluentd, collector-sidecar or logstash to parse the docker json files under /var/lib/docker/containers/*/*. But I still could not find a good way to collect the logs.
    2. After several days of searching I think fluentd is the best tool for this problem, and I found two ways to attack it. But I am still stuck on my problem.

      1. Use fluent-plugin-detect-exceptions + gelf, but this still results in two messages. See: Docker logs (Java stack trace that are long) NOT meaningful in Splunk, Is available to concat stack trace and previous line with error message
      2. The raw log:

         {"log":"2018-04-19 14:19:57,915 INFO  [FixedTimeScheduler] com.testjavatest.fastdemo.ws.WebSocketClientManager: send ws message -{TAB} for envirnem -{{\"res\":\"heartbeat\"}}\n","stream":"stdou
         t","time":"2018-04-19T06:19:57.916259717Z"}
         {"log":"2018-04-19 14:19:57,915 INFO  [FixedTimeScheduler] com.testjavatest.fastdemo.ws.WebSocketClientManager: send ws message -{TAB} for envirnem -{{\"res\":\"heartbeat\"}}\n","stream
         ":"stdout","time":"2018-04-19T06:19:57.916265977Z"}
         {"log":"2018-04-19 14:20:43,446 ERROR [FixedTimeScheduler] com.testjavatest.fastdemo.task.JobTask: Connect to cloud.testjavatest.com:5002 [cloud.testjavatest.com/10.111.2.77] failed: Connection timed o
         ut (Connection timed out)\n","stream":"stdout","time":"2018-04-19T06:20:43.448436321Z"}
         {"log":"org.apache.http.conn.HttpHostConnectException: Connect to cloud.testjavatest.com:5002 [cloud.testjavatest.com/10.111.2.77] failed: Connection timed out (Connection timed out)\n","stream":"s
         tdout","time":"2018-04-19T06:20:43.448475801Z"}
         {"log":"\u0009at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:151)\n","stream":"stdout","time":"2018-04-19T06:20:43.4484860
         6Z"}
         {"log":"\u0009at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)\n","stream":"stdout","time":"2018-04-19T06:20:43.448492586
         Z"}
         {"log":"\u0009at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)\n","stream":"stdout","time":"2018-04-19T06:20:43.448498085Z"}
         {"log":"\u0009at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)\n","stream":"stdout","time":"2018-04-19T06:20:43.448503302Z"}
         {"log":"\u0009at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)\n","stream":"stdout","time":"2018-04-19T06:20:43.4485085Z"}
         {"log":"\u0009at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)\n","stream":"stdout","time":"2018-04-19T06:20:43.448527373Z"}
         {"log":"\u0009at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)\n","stream":"stdout","time":"2018-04-19T06:20:43.448532363Z"}
         {"log":"\u0009at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)\n","stream":"stdout","time":"2018-04-19T06:20:43.44853704Z"}
         {"log":"\u0009at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)\n","stream":"stdout","time":"2018-04-19T06:20:43.448541864Z"}
         {"log":"\u0009at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107)\n","stream":"stdout","time":"2018-04-19T06:20:43.448546452Z"}
         {"log":"\u0009at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)\n","stream":"stdout","time":"2018-04-19T06:20:43.448551288Z"}
         {"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.send(MyHttpClient.java:308)\n","stream":"stdout","time":"2018-04-19T06:20:43.448555115Z"}
         {"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.send(MyHttpClient.java:264)\n","stream":"stdout","time":"2018-04-19T06:20:43.448559028Z"}
         {"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.recvBytes(MyHttpClient.java:470)\n","stream":"stdout","time":"2018-04-19T06:20:43.448563048Z"}
         {"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.recvString(MyHttpClient.java:474)\n","stream":"stdout","time":"2018-04-19T06:20:43.448567524Z"}
         {"log":"\u0009at com.testjavatest.httpclient.MyHttpClient.recvJSON(MyHttpClient.java:483)\n","stream":"stdout","time":"2018-04-19T06:20:43.448571872Z"}
         {"log":"\u0009at com.testjavatest.fastdemo.task.JobTask.run(JobTask.java:70)\n","stream":"stdout","time":"2018-04-19T06:20:43.448576282Z"}
         {"log":"\u0009at com.testjavatest.fastdemo.task.timer.FixedTimeScheduler.run(FixedTimeScheduler.java:43)\n","stream":"stdout","time":"2018-04-19T06:20:43.44858108Z"}
         {"log":"Caused by: java.net.ConnectException: Connection timed out (Connection timed out)\n","stream":"stdout","time":"2018-04-19T06:20:43.448585665Z"}
         {"log":"\u0009at java.net.PlainSocketImpl.socketConnect(Native Method)\n","stream":"stdout","time":"2018-04-19T06:20:43.448590045Z"}
         {"log":"\u0009at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)\n","stream":"stdout","time":"2018-04-19T06:20:43.448596151Z"}
         {"log":"\u0009at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)\n","stream":"stdout","time":"2018-04-19T06:20:43.448600888Z"}
         {"log":"\u0009at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)\n","stream":"stdout","time":"2018-04-19T06:20:43.448605225Z"}
         {"log":"\u0009at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)\n","stream":"stdout","time":"2018-04-19T06:20:43.448609704Z"}
         {"log":"\u0009at java.net.Socket.connect(Socket.java:589)\n","stream":"stdout","time":"2018-04-19T06:20:43.448614111Z"}
         {"log":"\u0009at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:337)\n","stream":"stdout","time":"2018-04-19T06:20:43.448618537Z"}
         {"log":"\u0009at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134)\n","stream":"stdout","time":"2018-04-19T06:20:43.4486230
         34Z"}
         {"log":"\u0009... 17 more\n","stream":"stdout","time":"2018-04-19T06:20:43.448627326Z"}
         {"log":"2018-04-19 14:20:57,934 INFO  [FixedTimeScheduler] com.testjavatest.fastdemo.config.WsJobTask: ws heartbeat run\n","stream":"stdout","time":"2018-04-19T06:20:57.934646438Z"}
        

        The expected output for gelf:

        One Java event (even one that spans many lines, such as a stack trace) should arrive as a single event, not as multiple line-by-line events.
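
        For illustration, the single record I would like gelf to receive for the ERROR above would look roughly like this (hand-assembled from the raw log; most stack frames are elided here for brevity):

          {"log":"2018-04-19 14:20:43,446 ERROR [FixedTimeScheduler] com.testjavatest.fastdemo.task.JobTask: Connect to cloud.testjavatest.com:5002 [cloud.testjavatest.com/10.111.2.77] failed: Connection timed out (Connection timed out)\norg.apache.http.conn.HttpHostConnectException: Connect to cloud.testjavatest.com:5002 [cloud.testjavatest.com/10.111.2.77] failed: Connection timed out (Connection timed out)\n\u0009at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:151)\n ... \n\u0009... 17 more\n","stream":"stdout","time":"2018-04-19T06:20:43.448436321Z"}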

        My fluentd config is:

        <source>
          @id fluentd-containers.log
          @type tail
          path "/var/lib/docker/containers/52b26345771ff24b76840a57f26808562aea0e917d72746d48aac6f9870b901b/52b26345771ff24b76840a57f26808562aea0e917d72746d48aac6f9870b901b-json.log"
          pos_file "/var/log/fluentd-containers.log.pos"
          time_format %Y-%m-%dT%H:%M:%S.%NZ
          tag "raw.docker.*"
          format json
          read_from_head true
          <parse>
            time_format %Y-%m-%dT%H:%M:%S.%NZ
            @type json
            time_type string
          </parse>
        </source>
        <match raw.docker.**>
          @id raw.docker
          @type detect_exceptions
          remove_tag_prefix "raw"
          message "log"
          stream "stream"
          languages java, python
          multiline_flush_interval 5
          max_bytes 0
          max_lines 0
        </match>
        <match docker.**>
          @type copy
          <store>
            @type "gelf"
            protocol "udp"
            host "192.168.2.4"
            port 12206
            flush_interval 5s
            <buffer>
              flush_mode interval
              retry_type exponential_backoff
              flush_interval 5s
            </buffer>
          </store>
          <store>
            @type "stdout"
          </store>
        </match>
        

        But this does not work. Could someone give me a hand?
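
        As far as I can tell, fluent-plugin-detect-exceptions only buffers and joins the lines of the stack trace itself, starting at the line that looks like an exception class, and passes every other record through unchanged. If that is right, it would explain why I still see two messages: the application's own ERROR line stays one record and the joined trace becomes a second one, roughly (abridged sketch, not actual output):

          {"log":"2018-04-19 14:20:43,446 ERROR [FixedTimeScheduler] com.testjavatest.fastdemo.task.JobTask: Connect to cloud.testjavatest.com:5002 ... failed: Connection timed out (Connection timed out)\n", ...}
          {"log":"org.apache.http.conn.HttpHostConnectException: Connect to cloud.testjavatest.com:5002 ... failed: Connection timed out (Connection timed out)\n\u0009at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:151)\n ...", ...}

        What I want instead is the single merged record sketched earlier.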

        1. Use fluent-plugin-grok-parser + fluent-plugin-concat + gelf, but this still does not work. Helpful links: Handle multiline with empty line from kubernetes/docker with --log-driver=json-file, Multiple grok patterns with multiline not parsing the log (the same question as mine). The log is as shown above, and the expected output is as described above.
        2. My fluentd config is:

           <source>
             @id fluentd-containers.log
             @type tail
             from_encoding UTF-8
             encoding UTF-8
             path /var/lib/docker/containers/52b26345771ff24b76840a57f26808562aea0e917d72746d48aac6f9870b901b/52b26345771ff24b76840a57f26808562aea0e917d72746d48aac6f9870b901b-json.log
             pos_file /var/log/fluentd-containers.log.pos
             time_format %Y-%m-%dT%H:%M:%S.%NZ
             tag raw.docker.*
             format json
             read_from_head true
           </source>
           <filter raw.docker.**> 
             @type grep
             regexp1 stream stdout
             @type concat
             key log
             multiline_start_regexp /^\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\,\d+/ 
             continuous_line_regexp /^\s+/ 
             separator ""
             flush_interval 3s
           </filter>
           <filter raw.docker.**>
             @type parser
             key_name log
             inject_key_prefix log.
             <parse>
               @type multiline_grok
               grok_failure_key grokfailure
               <grok>
                 pattern /%{TIMESTAMP_ISO8601:log_time}%{SPACE}%{LOGLEVEL:log_level}%{SPACE}\[%{DATA:threadname}\]%{SPACE}%{DATA:classname}%{SPACE}:%{SPACE}%{GREEDYDATA:log_message}/ 
               </grok>
             </parse>
           </filter>
           <match raw.docker.**>
             @type copy
             <store>
               @type gelf
               protocol udp
               host 10.111.2.4
               port 12204
               flush_interval 5s
             </store>
             <store>
               @type stdout
             </store>
             <store>
               @type file
               path /var/log/test
             </store> 
           </match>    
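
           For reference, this is what I would expect a single INFO line to become after the grok filter, assuming the pattern matches and given inject_key_prefix log. (a hand-worked sketch, not actual output):

             {"log.log_time":"2018-04-19 14:19:57,915","log.log_level":"INFO","log.threadname":"FixedTimeScheduler","log.classname":"com.testjavatest.fastdemo.ws.WebSocketClientManager","log.log_message":"send ws message -{TAB} for envirnem -{{\"res\":\"heartbeat\"}}"}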
          

          But this does not work either. Could someone give me a hand? Thanks.
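
          One thing I am not sure about in the config above: the first <filter> block contains two @type lines (grep and concat). A Fluentd filter section takes a single @type, so I suspect they need to be two separate blocks, something like the following sketch (the grep part rewritten in the <regexp> sub-section form; untested):

            <filter raw.docker.**>
              @type grep
              <regexp>
                key stream
                pattern /^stdout$/
              </regexp>
            </filter>
            <filter raw.docker.**>
              @type concat
              key log
              multiline_start_regexp /^\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\,\d+/
              continuous_line_regexp /^\s+/
              separator ""
              flush_interval 3s
            </filter>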

          PS: I wonder whether there is a better way to collect docker logs?

0 Answers:

No answers yet