I am using Fluentd and Winston to send all logs to a file and to Elasticsearch. I am logging JSON data from the server. Below is what ends up in the log file:
2018-06-06T15:54:37+05:30 service.api {"level":"infp","message":"{ user: null,\n serviceid: null,\n path: null,\n method: null,\n agent: null,\n code: null,\n clientip: null,\n size: null,\n message: 'API LOGGER INITIATED',\n http_x_forwarded_for: null,\n parameters: null,\n query: null,\n body: null,\n controller: null,\n controller_method: null,\n correlation_id: null,\n fileInfo: null,\n time: 1528280677002 }","meta":{"end":true}}
I need to remove the `\n` sequences from the file and have the message stored as JSON rather than a string. I also want to rename the field "message" to "log".
Below is my Fluentd configuration:
## SYSTEM CONFIGURATION
<system>
  <log>
    format json
    time_format %Y-%m-%d
  </log>
</system>
########################################################
#### External plugins ##################################
#### 1. fluent-plugin-forest ###########################
#### https://github.com/tagomoris/fluent-plugin-forest##
########################################################
## INFO LOGS
<source>
  @type forward
  bind 127.0.0.1
  port 8001
</source>

<match service.**>
  @type copy
  <store>
    ## writing logs to file per microservice
    @type forest
    subtype file
    <template>
      append true
      compress gzip
      <format>
        @type json
        add_newline true # tried with both true and false
        message_key record true # not working
      </format>
      <buffer time>
        # @type file
        timekey 1
        timekey_wait 1
        retry_max_times 5
      </buffer>
      path /var/log/fluentd/__TAG__/log/file
    </template>
  </store>
  <store>
    @type grep
    key level
    # pattern verbose
    add_tag_prefix access
  </store>
</match>

## Writing logs to Elasticsearch
<match access.**>
  @type elasticsearch
  host localhost
  port 9200
  logstash_format true
</match>
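Assuming the incoming `message` field were valid JSON (which, per the sample above, it currently is not), one way to expand it and rename it to `log` would be a `filter` with the stock `parser` plugin, placed before the `<match service.**>` block. This is a sketch, not tested against this exact pipeline; the parameter names (`hash_value_field`, `remove_key_name_field`, `reserve_data`) come from Fluentd's bundled filter_parser plugin:

```
<filter service.**>
  @type parser
  key_name message            # parse the "message" field of each record
  reserve_data true           # keep the other fields (level, meta, ...)
  remove_key_name_field true  # drop the original string "message" field
  hash_value_field log        # nest the parsed object under "log"
  <parse>
    @type json
  </parse>
</filter>
```

With this in place the file and Elasticsearch stores receive a record whose `log` field is a structured object rather than an escaped string, so no `\n` stripping is needed downstream.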