I have a Redis database, Logstash, and two target databases: Elasticsearch and InfluxDB. I am shipping keys from Redis to Elasticsearch, which works fine, and now want to get the InfluxDB side working, which does not.
Does anyone have a working output connector for InfluxDB, or how should I format the data I feed in through Redis to make this work?
Here is my InfluxDB output, which only throws errors:
influxdb {
    host => "localhost"
    measurement => "sensor1"
    allow_time_override => true
    use_event_fields_for_data_points => true
    exclude_fields => ["@version", "@timestamp", "sequence", "type", "host"]
}
Here is my Redis input, which works fine:
redis {
    host => "localhost"
    data_type => "list"
    key => "vortex"
    threads => 4
    type => "testrecord"
    codec => "plain"
}
I also tried pushing entries already in InfluxDB line-protocol format, "sensor1,measure=1 1489594615.9747", into the Redis list, e.g.
key: vortex
values:
sensor1,measure=1 1489594615.9747
sensor1,measure=1 1489594615.9747
sensor1,measure=1 1489594615.9747
sensor1,measure=1 1489594615.9747
sensor1,measure=1 1489594615.9747
....
but that does not work either.
Has anyone managed to get data from Redis into InfluxDB via Logstash?
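One likely reason the pre-formatted line-protocol entries fail: with `codec => "plain"`, each popped Redis list entry becomes a single event whose payload lives entirely in the `message` field, so `use_event_fields_for_data_points` finds no per-key fields to turn into points. A minimal Python sketch of the resulting event shape (field names here are illustrative, not Logstash internals):

```python
# Assumption: with codec => "plain", the whole list entry lands in
# one "message" field; the line protocol is never split into fields.
raw = "sensor1,measure=1 1489594615.9747"
event = {"message": raw, "type": "testrecord"}

# After exclude_fields-style filtering of metadata, only "message"
# remains -- no key=value pairs for the influxdb output to use.
data_fields = {k: v for k, v in event.items() if k not in ("type", "host")}
print(data_fields)  # {'message': 'sensor1,measure=1 1489594615.9747'}
```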
Answer (score: 1)
First, replace the retention policy in the influxdb output plugin ("the retention policy to use" setting):

config :retention_policy, :validate => :string, :default => "autogen"
# config :retention_policy, :validate => :string, :default => "default"   << original
In Redis I used the following string format (without a timestamp): foo=70617 bar=3
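The string above is what Logstash's kv filter can split into event fields. A hedged Python sketch of that parsing step (a simplified re-implementation for illustration, not the plugin's actual code):

```python
# Rough equivalent of the kv filter's defaults: split the message on
# spaces, then split each token on "=" into a field name and value.
def kv_parse(message, field_split=" ", value_split="="):
    fields = {}
    for token in message.split(field_split):
        if value_split in token:
            key, _, value = token.partition(value_split)
            fields[key] = value
    return fields

print(kv_parse("foo=70617 bar=3"))  # {'foo': '70617', 'bar': '3'}
```

Each resulting field then becomes a data point (or a tag, via `send_as_tags`) in the influxdb output.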
The Logstash config file looks like this, just as an example of how to do it so that it finally works :-)
input {
    redis {
        host => "localhost"
        data_type => "list"
        key => "vortex"
        threads => 4
        type => "testrecord"
        codec => "plain"
    }
}

filter {
    kv {
        add_field => {
            "test1" => "yellow=cat"
            "test=space" => "making= life=hard"
            "feild= space" => "pink= dog"
        }
    }
}

output {
    stdout { codec => rubydebug }
    influxdb {
        host => "localhost"
        measurement => "myseries"
        allow_time_override => true
        use_event_fields_for_data_points => true
        exclude_fields => ["@version", "@timestamp", "sequence", "message", "type", "host"]
        send_as_tags => ["bar", "baz", "test1", "test=space"]
    }
}
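For completeness, a sketch of roughly what the influxdb output writes for one kv-parsed event, with `bar` sent as a tag per `send_as_tags` and `foo` as a data point (an assumption about the emitted line protocol, using the field names from the example above, not the plugin's actual serializer):

```python
# Build an InfluxDB line-protocol string: measurement, comma-joined
# tags, a space, comma-joined fields, and an optional timestamp.
def to_line_protocol(measurement, tags, fields, timestamp=None):
    line = measurement
    if tags:
        line += "," + ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    line += " " + ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    if timestamp is not None:
        line += f" {timestamp}"
    return line

print(to_line_protocol("myseries", {"bar": "3"}, {"foo": "70617"}))
# myseries,bar=3 foo=70617
```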