Incremental loading in Logstash

Posted: 2016-12-09 00:14:02

Tags: elasticsearch logstash

I am trying to use Logstash to load new records from a log file whose columns are enclosed in square brackets. The file name and location will stay the same, but new records will be appended to the bottom of the file. Logstash needs to continuously monitor the file and load the new records as they arrive. Any help is much appreciated!

Example: FILE1.log

[2016-12-08T18:08:22,779][INFO ][o.e.t.TransportService   ] [58Z8vjS] publish_address {127.0.0.1:9300}, bound_addresses {[::1]:9300}, {127.0.0.1:9300}
[2016-12-08T18:08:22,788][WARN ][o.e.b.BootstrapCheck     ] [58Z8vjS] max virtual memory areas vm.max_map_count [65530] is too low, increase to at least [262144]
[2016-12-08T18:08:26,093][INFO ][o.e.c.s.ClusterService   ] [58Z8vjS] new_master {58Z8vjS}{58Z8vjS8TviDSa9V1Yg-5w}{u4upnHFyS7GQ6UcxQb3Rvw}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-elected-as-master ([0] nodes joined)
[2016-12-08T18:08:26,241][INFO ][o.e.h.HttpServer         ] [58Z8vjS] publish_address {127.0.0.1:9200}, bound_addresses {[::1]:9200}, {127.0.0.1:9200}
[2016-12-08T18:08:26,241][INFO ][o.e.n.Node               ] [58Z8vjS] started
[2016-12-08T18:08:27,486][INFO ][o.e.g.GatewayService     ] [58Z8vjS] recovered [10] indices into cluster_state
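For reference, here is a minimal pipeline sketch for this scenario, assuming the log lives at /var/log/FILE1.log and Elasticsearch runs on localhost (both the path and the index name are placeholders). The file input tails the file and persists its read position in a sincedb file, so only newly appended lines are processed on later runs, and a grok filter splits the bracketed columns into separate fields:

input {
  file {
    path => "/var/log/FILE1.log"                        # placeholder path; the file is tailed continuously
    start_position => "beginning"                       # read existing lines the first time the file is seen
    sincedb_path => "/var/lib/logstash/sincedb_file1"   # read position is saved here, so restarts resume where they left off
  }
}

filter {
  grok {
    # split the bracketed columns: timestamp, log level, component, node, then the rest of the line
    match => { "message" => "\[%{TIMESTAMP_ISO8601:log_timestamp}\]\[%{LOGLEVEL:loglevel}\s*\]\[%{DATA:component}\s*\] \[%{DATA:node}\] %{GREEDYDATA:log_message}" }
  }
  date {
    # use the timestamp from the log line as the event's @timestamp
    match => [ "log_timestamp", "yyyy-MM-dd'T'HH:mm:ss,SSS" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]    # placeholder host
    index => "file1-logs"          # placeholder index name
  }
  stdout { codec => rubydebug }    # useful while testing; remove in production
}

With this setup Logstash keeps the file open and ships each newly appended line as its own event; deleting the sincedb file forces a full re-read from the beginning.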

0 Answers:

No answers yet