I have the following logstash configuration:
input {
  file {
    codec => "json_lines"
    path => ["/etc/logstash/input.log"]
    sincedb_path => "/etc/logstash/dbfile"
    start_position => "beginning"
    ignore_older => "0"
  }
}

output {
  elasticsearch {
    hosts => ["192.168.169.46:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
The /etc/logstash/input.log file is filled with logs from a running Java application. The logs have the following JSON format (they are written as single lines, delimited by \n characters):
{
  "exception": {
    "exception_class": "java.lang.RuntimeException",
    "exception_message": "Test runtime exception stack: 0",
    "stacktrace": "java.lang.RuntimeException: Test runtime exception stack: 0"
  },
  "@version": 1,
  "source_host": "WS-169-046",
  "message": "Test runtime exception stack: 0",
  "thread_name": "parallel-1",
  "@timestamp": "2019-12-02T16:30:14.084+02:00",
  "level": "ERROR",
  "logger_name": "nl.hnf.logs.aggregator.demo.LoggingTest",
  "aplication-name": "demo-log-aggregation"
}
I also updated the default logstash template through the Elasticsearch API (a PUT request with the body below to http://192.168.169.46:9200/_template/logstash?pretty):
{
  "index_patterns": "logstash-*",
  "version": 60002,
  "settings": {
    "index.refresh_interval": "5s",
    "number_of_shards": 1
  },
  "mappings": {
    "dynamic_templates": [
      {
        "message_field": {
          "path_match": "message",
          "match_mapping_type": "string",
          "mapping": {
            "type": "text",
            "norms": false
          }
        }
      },
      {
        "string_fields": {
          "match": "*",
          "match_mapping_type": "string",
          "mapping": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          }
        }
      }
    ],
    "properties": {
      "@timestamp": {
        "type": "date"
      },
      "@version": {
        "type": "keyword"
      },
      "source_host": {
        "type": "keyword"
      },
      "message": {
        "type": "text"
      },
      "thread_name": {
        "type": "text"
      },
      "level": {
        "type": "keyword"
      },
      "logger_name": {
        "type": "keyword"
      },
      "aplication_name": {
        "type": "keyword"
      },
      "exception": {
        "dynamic": true,
        "properties": {
          "exception_class": {
            "type": "text"
          },
          "exception_message": {
            "type": "text"
          },
          "stacktrace": {
            "type": "text"
          }
        }
      }
    }
  }
}
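For reference, a template update like this can be applied and then read back with curl; the host and port come from the config above, and the file name logstash-template.json is only a placeholder for the JSON body shown here:

  # Apply the template (body saved in a local file; the file name is hypothetical)
  curl -X PUT "http://192.168.169.46:9200/_template/logstash?pretty" \
    -H 'Content-Type: application/json' \
    -d @logstash-template.json

  # Read back what Elasticsearch actually stored
  curl -X GET "http://192.168.169.46:9200/_template/logstash?pretty"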
Elasticsearch responds with "acknowledged": true, and I can see through the API that the template is updated. Now, starting logstash at debug log level, I can see the input log being read, but nothing is sent to elasticsearch: the index is created but always stays empty (0 documents):
[2019-12-03T09:30:51,655][DEBUG][logstash.inputs.file ][custom] Received line {:path=>"/etc/logstash/input.log", :text=>"{\"@version\":1,\"source_host\":\"ubuntu\",\"message\":\"Generating some logs: 65778 - 2019-12-03T09:30:50.775\",\"thread_name\":\"parallel-1\",\"@timestamp\":\"2019-12-03T09:30:50.775+00:00\",\"level\":\"INFO\",\"logger_name\":\"nl.hnf.logs.aggregator.demo.LoggingTest\",\"aplication-name\":\"demo-log-aggregation\"}"}
[2019-12-03T09:30:51,656][DEBUG][filewatch.sincedbcollection][custom] writing sincedb (delta since last write = 1575365451)
The elasticsearch logs are also at debug level, but I see no errors there, nor anything else that would hint at the source of the problem.

Do you have any ideas or suggestions as to why the logs are not being pushed to elasticsearch?
Answer 0 (score: 0)
In filebeat, setting ignore_older to zero means "do not check how old the file is". For the logstash file input it means "ignore files more than zero seconds old", which is effectively "ignore everything". Remove it. If that does not help, raise the log level to trace and see what the filewatch module says about the files it is watching, as in the sketch below.
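As a sketch, this is the input block from the question with ignore_older removed:

  input {
    file {
      codec => "json_lines"
      path => ["/etc/logstash/input.log"]
      sincedb_path => "/etc/logstash/dbfile"
      start_position => "beginning"
    }
  }

And one way to raise filewatch logging to trace at runtime; port 9600 is logstash's default monitoring API port, and the logger.filewatch key is an assumption based on the filewatch.sincedbcollection logger name visible in the debug output above:

  # Raise filewatch logging to trace without restarting logstash
  curl -X PUT "http://localhost:9600/_node/logging?pretty" \
    -H 'Content-Type: application/json' \
    -d '{ "logger.filewatch": "TRACE" }'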
Answer 1 (score: 0)
I used the json codec instead of json_lines and removed start_position, ignore_older and sincedb_path:
input {
  file {
    codec => "json"
    path => ["/etc/logstash/input.log"]
  }
}

output {
  elasticsearch {
    hosts => ["192.168.169.46:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
The json_lines codec seems to be incompatible with the file input (the \n delimiter is not handled properly).