I followed the tutorial found here to set up the ELK stack, and eventually I got everything working. However, when I tried to modify the setup to read in a CSV file, it stopped working entirely. The conf file looks like this:
input {
  file {
    path => "/home/user/remaining/path/*.csv"
    type => "transaction"
    start_position => "beginning"
  }
}
filter {
  if [type] == "transaction" {
    csv {
      columns => ["@timestamp", "ip address", "domain", "user", "demo", "id", "activity", "detail"]
      separator => ","
    }
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    action => "index"
    index => "logstash-test"
    workers => 1
  }
}
I have tried several tutorials and guides to set this up, and as far as I can tell Logstash still connects to Elasticsearch. I just don't see any output, and I suspect the file isn't being read at all. Part of the problem is that I'm not sure how to test each component of Logstash individually. Is there something I might have missed?
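One way to test the filter in isolation (a suggestion not in the original post) is to bypass the file input and feed a sample line over stdin. A minimal config for that, reusing the same csv filter, might look like this:

```
input {
  stdin { type => "transaction" }
}
filter {
  if [type] == "transaction" {
    csv {
      columns => ["@timestamp", "ip address", "domain", "user", "demo", "id", "activity", "detail"]
      separator => ","
    }
  }
}
output {
  stdout { codec => rubydebug }
}
```

Run Logstash with this config, paste one CSV line, and check whether a parsed event appears on stdout. If it does, the csv filter is fine and the problem lies with the file input or the elasticsearch output instead.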
Edit: the CSV file typically looks like this:
2016-02-29T22:26:39.319700,22.111.11.11,place.domain.ca,bob,DEMO,95081299250aa8,TI_START,"{'field': 'data', 'field2': 'moredata', 'anotherfield': 'evenmoredata', 'continuedfield': 'habbo', 'size': '16'}"
2016-02-29T22:27:00.098426,24.111.11.11,otherplace.domain.ca,bob,DEMO,390s8clearlyfake,TI_END,"{'field': 'data', 'field2': 'moredata', 'anotherfield': 'evenmoredata', 'continuedfield': 'habbo', 'size': '16'}"
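As a quick sanity check on the data format (this snippet is mine, not from the original post), the sample line above does split into exactly the eight columns the csv filter expects, which can be confirmed with Python's stdlib csv module:

```python
# Verify that a sample line splits into the eight expected columns.
import csv
import io

columns = ["@timestamp", "ip address", "domain", "user", "demo", "id", "activity", "detail"]

line = ("2016-02-29T22:26:39.319700,22.111.11.11,place.domain.ca,bob,DEMO,"
        "95081299250aa8,TI_START,"
        "\"{'field': 'data', 'field2': 'moredata', 'anotherfield': 'evenmoredata', "
        "'continuedfield': 'habbo', 'size': '16'}\"")

row = next(csv.reader(io.StringIO(line)))
record = dict(zip(columns, row))

# The double quotes keep the comma-laden detail blob together as one field.
print(len(row))            # 8
print(record["activity"])  # TI_START
```

Since the quoting is well-formed, the csv filter configuration itself is unlikely to be the culprit.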
I also noticed that when I go to localhost:9200/logstash-test/ I get a 404. I'm not sure whether that's because no data has been shipped, or whether there's a separate connection problem.
Answer 0 (score: 0)
As part of the discussion here: https://discuss.elastic.co/t/logstash-not-showing-any-output-solved/28636/16

My files were more than 24 hours old, which is the default cutoff for the file input. Fixed by changing the config as follows, where ignore_older defaults to 86400 (seconds):
input {
  file {
    ignore_older => 864000
    path => "/home/sean/cost-logs/transaction/*.csv"
    type => "transaction"
    start_position => "beginning"
  }
}
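One more pitfall worth noting (my addition, not part of the original answer): the file input also remembers how far it has read each file in a sincedb file, so a file that was already seen will not be re-read from the beginning even after raising ignore_older. While testing, a common workaround is to discard that state by pointing sincedb_path at /dev/null:

```
input {
  file {
    ignore_older => 864000
    path => "/home/sean/cost-logs/transaction/*.csv"
    type => "transaction"
    start_position => "beginning"
    sincedb_path => "/dev/null"  # testing only: forget read positions between runs
  }
}
```

Remove the sincedb_path override in production, or every restart will re-ingest the same files and create duplicate documents.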