So I've seen a few other questions along these lines, but none of them seem to solve my problem.
I'm trying to take Spring Boot logs from a file, parse out the useful information, and send the result to Elasticsearch, to ultimately be read from Kibana. My fluentd.conf looks like this:
<source>
  type tail
  read_from_head true
  path /path/to/log/
  pos_file /path/to/pos_file
  format /^(?<date>[0-9]+-[0-9]+-[0-9]+\s+[0-9]+:[0-9]+:[0-9]+.[0-9]+)\s+(?<log_level>[Aa]lert|ALERT|[Tt]race|TRACE|[Dd]ebug|DEBUG|[Nn]otice|NOTICE|[Ii]nfo|INFO|[Ww]arn?(?:ing)?|WARN?(?:ING)?|[Ee]rr?(?:or)?|ERR?(?:OR)?|[Cc]rit?(?:ical)?|CRIT?(?:ICAL)?|[Ff]atal|FATAL|[Ss]evere|SEVERE|EMERG(?:ENCY)?|[Ee]merg(?:ency)?)\s+(?<pid>[0-9]+)\s+---\s+(?<message>.*)$/
  tag my.app
</source>
<match my.app>
  type stdout
</match>
<match my.app>
  type elasticsearch
  logstash_format true
  host myhosthere
  port 9200
  index_name fluentd-app
  type_name fluentd
</match>
Given a typical Spring Boot log line:
2015-07-16 19:20:04.074 INFO 16649 --- [ main] {springboot message}
By writing to stdout as a test, I can see that my parser produces:
{
  "date": "2015-07-16 19:20:04.074",
  "log_level": "INFO",
  "pid": "16649",
  "message": "[ main] {springboot message}"
}
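As a sanity check outside fluentd, here's a rough Python equivalent of the format regexp (an illustration only, not part of my setup): Python needs (?P<name>...) instead of Ruby's (?<name>...), and I've collapsed the long log-level alternation to a plain word match for brevity.

import json
import re

# Rough Python stand-in for the fluentd "format" regexp above:
# same named capture groups, simplified log-level alternation.
pattern = re.compile(
    r"^(?P<date>[0-9]+-[0-9]+-[0-9]+\s+[0-9]+:[0-9]+:[0-9]+\.[0-9]+)\s+"
    r"(?P<log_level>[A-Za-z]+)\s+"
    r"(?P<pid>[0-9]+)\s+---\s+"
    r"(?P<message>.*)$"
)

line = "2015-07-16 19:20:04.074 INFO 16649 --- [ main] {springboot message}"
match = pattern.match(line)
if match:
    # Prints the same date / log_level / pid / message fields shown above.
    print(json.dumps(match.groupdict(), indent=2))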
However, when this is written to Elasticsearch, all the results look like:
{
  _index: "fluentd-app-2015.07.16",
  _type: "fluentd",
  _id: "AU6YT5sjvkxiJXWCxeM8",
  _score: 1,
  _source: {
    message: "2015-07-16 19:20:04.074 INFO 16649 --- [ main] {springboot message}",
    @timestamp: "2015-07-16T19:20:04+00:00"
  }
},
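(That output comes from a plain search against the daily index; roughly, this is how I'm checking the stored documents, using only the standard Elasticsearch _search endpoint with the host, port, and index name shown above:)

import json
import urllib.request

# Fetch a few stored documents straight from Elasticsearch via the standard
# _search endpoint (host, port, and index name as shown above).
url = "http://myhosthere:9200/fluentd-app-2015.07.16/_search?size=3"
with urllib.request.urlopen(url) as resp:
    body = json.load(resp)

for hit in body["hits"]["hits"]:
    # _source only ever contains "message" and "@timestamp",
    # never the parsed date / log_level / pid fields.
    print(json.dumps(hit["_source"], indent=2))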
From what I had read of fluent-plugin-elasticsearch, I expected _source to contain all the parsed fields I see in stdout. I have also tried the grok parser, though it seems clear the problem lies in my understanding of the fluentd elasticsearch plugin. How do I get the fields I've parsed saved to Elasticsearch?