I am new to Elasticsearch, Kibana, and Logstash. I am trying to load a JSON file that looks like this:
{"timestamp":"2014-05-19T00:00:00.430Z","memoryUsage":42.0,"totalMemory":85.74,"usedMemory":78.77,"cpuUsage":26.99,"monitoringType":"jvmHealth"}
{"timestamp":"2014-05-19T00:09:10.431Z","memoryUsage":43.0,"totalMemory":85.74,"usedMemory":78.77,"cpuUsage":26.99,"monitoringType":"jvmHealth"}
{"timestamp":"2014-05-19T00:09:10.441Z","transactionTime":1,"nbAddedObjects":0,"nbRemovedObjects":0,"monitoringType":"transactions"}
{"timestamp":"2014-05-19T00:09:10.513Z","transactionTime":6,"nbAddedObjects":4,"nbRemovedObjects":0,"monitoringType":"transactions"}
No index is created, and the only message I get is this:
Using milestone 2 input plugin 'file'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.1/plugin-milestones {:level=>:warn}
What could be the problem? I could use the bulk API directly, but I have to use Logstash. Do you have any code suggestions that could help?
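For context, this is roughly what I mean by using bulk directly: a minimal sketch assuming Elasticsearch is listening on localhost:9200, with a hypothetical index name monitoring and a hypothetical payload file bulk_body.json. The bulk API expects an action line before each document and a trailing newline at the end of the body:

curl -s -XPOST 'http://localhost:9200/_bulk' --data-binary @bulk_body.json

where bulk_body.json interleaves action and source lines:

{"index":{"_index":"monitoring","_type":"jvmHealth"}}
{"timestamp":"2014-05-19T00:00:00.430Z","memoryUsage":42.0,"totalMemory":85.74,"usedMemory":78.77,"cpuUsage":26.99,"monitoringType":"jvmHealth"}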
Edit (config moved from the comments into the question):
input {
  file {
    path => "/home/ndoye/Elasticsearch/great_log.json"
    type => json
    codec => json
  }
}
filter {
  date {
    match => ["timestamp","yyyy-MM-dd HH:mm:ss.SSS"]
  }
}
output {
  stdout {
    #codec => rubydebug
  }
  elasticsearch {
    embedded => true
  }
}
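For reference, here is a variant I have been experimenting with, as a minimal sketch assuming Logstash 1.4.x: start_position => "beginning" tells the file input to read the file from the top instead of tailing it for new lines, sincedb_path stops Logstash from remembering its read position between runs, and the date pattern is changed because the timestamps in the file are ISO8601 rather than the space-separated pattern above:

input {
  file {
    path => "/home/ndoye/Elasticsearch/great_log.json"
    type => "json"
    codec => "json"
    # read the file from the top instead of only tailing new lines
    start_position => "beginning"
    # do not persist the read position, so the file is re-read on every run
    sincedb_path => "/dev/null"
  }
}
filter {
  date {
    # timestamps such as 2014-05-19T00:00:00.430Z are ISO8601
    match => ["timestamp", "ISO8601"]
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    embedded => true
  }
}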