Hey, I used this tutorial to set up JMeter with Logstash for Elasticsearch and Kibana: http://ecmarchitect.com/archives/2014/09/09/3932
The first time everything worked fine: a new jmeter-results
index was created and filled with my JMeter data via Logstash.
Today I tried it again with new JMeter data, but nothing happens.
No errors occur, but in the Logstash log I can see _discover_file_glob
being logged over and over again. Here is the relevant part of my log:
Registering file input {:path=>["/etc/apache-jmeter-2.12/bin/log2.jtl"], :level=>:info, :file=>"logstash/inputs/file.rb", :line=>"74"}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"/root/.sincedb_66c8ea3a6e5fbda3879299a795b893d5", :path=>["/etc/apache-jmeter-2.12/bin/log2.jtl"], :level=>:info, :file=>"logstash/inputs/file.rb", :line=>"115"}
Pipeline started {:level=>:info, :file=>"logstash/pipeline.rb", :line=>"78"}
_sincedb_open: reading from /root/.sincedb_66c8ea3a6e5fbda3879299a795b893d5 {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"199"}
_sincedb_open: setting [33297239, 0, 2306] to 44106 {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"203"}
_discover_file_glob: /etc/apache-jmeter-2.12/bin/log2.jtl: glob is: ["/etc/apache-jmeter-2.12/bin/log2.jtl"] {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"117"}
_discover_file: /etc/apache-jmeter-2.12/bin/log2.jtl: new: /etc/apache-jmeter-2.12/bin/log2.jtl (exclude is []) {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"126"}
_open_file: /etc/apache-jmeter-2.12/bin/log2.jtl: opening {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"98"}
/etc/apache-jmeter-2.12/bin/log2.jtl: sincedb last value 44106, cur size 44106 {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"122"}
/etc/apache-jmeter-2.12/bin/log2.jtl: sincedb: seeking to 44106 {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"124"}
writing sincedb (delta since last write = 1421612560) {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"177"}
/etc/apache-jmeter-2.12/bin/log2.jtl: file grew, old size 0, new size 44106 {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"81"}
Automatic template management enabled {:manage_template=>"true", :level=>:info, :file=>"logstash/outputs/elasticsearch_http.rb", :line=>"104"}
Template Search URL: {:template_search_url=>"http://localhost:9200/_template/*", :level=>:debug, :file=>"logstash/outputs/elasticsearch_http.rb", :line=>"112"}
_discover_file_glob: /etc/apache-jmeter-2.12/bin/log2.jtl: glob is: ["/etc/apache-jmeter-2.12/bin/log2.jtl"] {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"117"}
_discover_file_glob: /etc/apache-jmeter-2.12/bin/log2.jtl: glob is: ["/etc/apache-jmeter-2.12/bin/log2.jtl"] {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"117"}
_discover_file_glob: /etc/apache-jmeter-2.12/bin/log2.jtl: glob is: ["/etc/apache-jmeter-2.12/bin/log2.jtl"] {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"117"}
_discover_file_glob: /etc/apache-jmeter-2.12/bin/log2.jtl: glob is: ["/etc/apache-jmeter-2.12/bin/log2.jtl"] {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"117"}
I saw on the internet that the solution is to delete the .sincedb_
file, but still nothing happens.
Maybe someone can help me?
Answer 0 (score: 1)
Set start_position to beginning in the input section of your Logstash config so the same CSV file is processed again:
input {
  # read the JMeter result file from the start on every run;
  # sincedb_path => "/dev/null" means the read offset is never persisted
  file {
    path => [ "/CSV_File.csv" ]
    type => "JMeterlog"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  # drop the CSV header row, parse every other line into named fields
  if ([message] =~ "responseCode") {
    drop { }
  }
  else {
    csv { columns => ["timeStamp", "elapsed", "label", "responseCode", "responseMessage", "threadName", "dataType", "success", "bytes", "grpThreads", "allThreads", "URL", "Latency", "SampleCount", "ErrorCount", "IdleTime"] }
  }
}
output {
  # print parsed events to the console and index them into Elasticsearch
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-jmeter-results-%{+YYYY.MM.dd}"
    template => "jmeter-results-mapping.json"
    template_name => "logstash-jmeter-results"
    template_overwrite => false
  }
}
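For context: your log shows "sincedb last value 44106, cur size 44106", i.e. Logstash had already recorded the end of log2.jtl and therefore had nothing new to read. With sincedb_path => "/dev/null" the offset is never persisted, and start_position => "beginning" makes the file input read the whole file on every start instead of only tailing new lines. Assuming you save the config above as jmeter-csv.conf (a hypothetical name), running it would look roughly like:

bin/logstash -f jmeter-csv.conf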