I have test results stored in JSON files. Logstash watches the files and tries to send every line to Elasticsearch, but only about half of the lines arrive, and I can't figure out why some are skipped. For example, a file will have 34 lines but only 14 get sent.
input {
  file {
    path => "/data/*.json"
    start_position => "beginning"
  }
}
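One common cause of "missing" lines with this input: the file input remembers how far it has read in a sincedb file, so lines consumed during an earlier run (even one whose output failed) are never re-sent. A hedged variant for testing that forgets positions between runs (`sincedb_path` is a standard file-input option; pointing it at /dev/null is a common testing convention, not something for production):

```
input {
  file {
    path => "/data/*.json"
    start_position => "beginning"
    # Discard read positions between runs so the whole file is re-read (testing only).
    sincedb_path => "/dev/null"
  }
}
```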
# ----------------------------------------------------------------------
filter {
  # Parse fields out of the JSON message.
  json {
    source => "message"
  }
}
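When the json filter cannot parse a line, Logstash does not drop the event silently; it tags it with `_jsonparsefailure`. A hedged sketch that routes such events to the console so they are easy to spot, while clean events go to Elasticsearch (the tag name is standard Logstash behavior; the hosts and index mirror the config above):

```
output {
  if "_jsonparsefailure" in [tags] {
    # Inspect unparseable lines on the console instead of indexing them.
    stdout { codec => rubydebug }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "ct-%{+YYYY.MM.dd}"
    }
  }
}
```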
# ----------------------------------------------------------------------
output {
  elasticsearch {
    hosts => ["host:9200", "localhost:9200"]
    index => "ct-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
I'm not sure whether something in the JSON itself is causing Logstash to skip lines, or whether there is a problem with the logstash.conf file I posted above.
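A quick way to rule out the input data is to check that every line of the file parses as a standalone JSON document, since the file input reads line by line and a record spanning multiple lines will not parse. A small sketch of that check (the sample records and their field names are assumptions, not your actual data):

```python
import json
import os
import tempfile

# Hypothetical sample file mirroring the test-result records (field names assumed).
sample = '{"test": "a", "result": 1}\n{"test": "b", "result": 0}\n'
path = os.path.join(tempfile.gettempdir(), "sample.json")
with open(path, "w") as f:
    f.write(sample)

# Try to parse each physical line on its own, collecting the line numbers that fail.
bad = []
with open(path) as f:
    for n, line in enumerate(f, 1):
        try:
            json.loads(line)
        except ValueError:
            bad.append(n)
print(f"{n} lines checked, {len(bad)} unparseable: {bad}")
```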
Answer 0 (score: 0)
Logstash parses files of different types and sends the events to Elasticsearch in JSON format. In your case, since the files already contain JSON, a Filebeat agent with an Elasticsearch output is enough to ship them to ES and index them.
With Filebeat 6.x it looks like this:
#=========================== Filebeat inputs =============================
filebeat.inputs:
- type: log
  # Paths to the logs
  paths:
    - "/your/path/to/your/logs/file.json"
  # Tags to identify the log source; .gz files are excluded from the input
  tags: ["beats","yourtag"]
  exclude_files: ['\.gz$']
#================================ Outputs =====================================
#----------------------------- Elasticsearch output --------------------------------
output.elasticsearch:
  # The ES host & index name
  hosts: ["yourEShost:9200"]
  index: "ct-%{+YYYY.MM.dd}"
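Since the files already contain JSON, Filebeat can also decode it at the source instead of shipping raw lines; the 6.x log input has `json.*` options for this. A hedged input sketch (the path is a placeholder from the example above, and the option values are assumptions about your data):

```
filebeat.inputs:
- type: log
  paths:
    - "/your/path/to/your/logs/file.json"
  # Decode each line as JSON and lift its fields to the top level of the event.
  json.keys_under_root: true
  # Mark events whose lines fail to decode instead of dropping them silently.
  json.add_error_key: true
```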