I'm using the EFK stack in a Kubernetes cluster to parse the logs of the nginx ingress controller. I can specify the field types in Fluentd and they are delivered to Elasticsearch correctly, but Kibana does not recognize those fields as numbers.
Working configuration
<filter kubernetes.var.log.containers.nginx-ingress-controller-**.log>
@type parser
format /(?<remote_addr>[^ ]*) - \[(?<proxy_protocol_addr>[^ ]*)\] - (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<request>[^\"]*) +\S*)?" (?<code>[^ ]*) (?<size>[^ ]*) "(?<referer>[^\"]*)" "(?<agent>[^\"]*)" (?<request_length>[^ ]*) (?<request_time>[^ ]*) \[(?<proxy_upstream_name>[^ ]*)\] (?<upstream_addr>[^ ]*) (?<upstream_response_length>[^ ]*) (?<upstream_response_time>[^ ]*) (?<upstream_status>[^ ]*) (?<upstream_id>[^ ]*)/
time_format %d/%b/%Y:%H:%M:%S %z
key_name log
# Retain the original "log" field after parsing out the data.
reserve_data true
# These get sent to ES as the correct types.
types request_length:integer,request_time:float
</filter>
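To rule out Fluentd itself, a stdout filter can be chained after the parser so the emitted records show up in Fluentd's own log before they reach Elasticsearch. This is only a minimal sketch: filter_stdout ships with Fluentd, and the match pattern is assumed to be the same one used above.

<filter kubernetes.var.log.containers.nginx-ingress-controller-**.log>
  # Print each parsed record to Fluentd's log so the emitted
  # field types can be inspected before they reach Elasticsearch.
  @type stdout
</filter>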
They show up as numbers in the documents:
"request_length": 426,
"request_time": 0.007,
But Kibana still reports them as strings.
I refreshed the index pattern in Kibana and even deleted and recreated it, but the result is the same.
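Kibana takes its field types from the Elasticsearch index mapping, so presumably the mapping itself is the thing to check. A minimal sketch of how that could be done, assuming Elasticsearch is reachable at http://localhost:9200 and the Fluentd indices use the default logstash-* pattern (both are assumptions, adjust to match the cluster):

import json
import requests

# Hypothetical endpoint and index pattern; change to match the cluster.
resp = requests.get("http://localhost:9200/logstash-*/_mapping")
resp.raise_for_status()

for index, body in resp.json().items():
    # On Elasticsearch 7+ the field definitions sit directly under
    # "mappings.properties"; older versions add a document-type level in between.
    props = body.get("mappings", {}).get("properties", {})
    for field in ("request_length", "request_time"):
        # Print the type Elasticsearch actually assigned to each field.
        print(index, field, json.dumps(props.get(field)))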