I have the following CSV file:
tstp,voltage_A_real,voltage_B_real,voltage_C_real #header not present in actual file
2000-01-01 00:00:00,2535.53,-1065.7,-575.754
2000-01-01 01:00:00,2528.31,-1068.67,-576.866
2000-01-01 02:00:00,2528.76,-1068.49,-576.796
2000-01-01 03:00:00,2530.12,-1067.93,-576.586
2000-01-01 04:00:00,2531.02,-1067.56,-576.446
2000-01-01 05:00:00,2533.28,-1066.63,-576.099
2000-01-01 06:00:00,2535.53,-1065.7,-575.754
2000-01-01 07:00:00,2535.53,-1065.7,-575.754
....
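As a side note, the per-row work done by the csv, date, and mutate filters in the Logstash configuration can be sketched in plain Python; this is only an illustration of the intended parsing (column names taken from the filter configuration, sample rows from the file above):

```python
import csv
from datetime import datetime
from io import StringIO

# Two sample rows from the headerless CSV shown above.
raw = """2000-01-01 00:00:00,2535.53,-1065.7,-575.754
2000-01-01 01:00:00,2528.31,-1068.67,-576.866"""

# Column names assigned manually, since the file has no header row
# (mirrors the csv filter's "columns" option).
columns = ["tstp", "Voltage_A_real", "Voltage_B_real", "Voltage_C_real"]

rows = []
for values in csv.reader(StringIO(raw)):
    row = dict(zip(columns, values))
    # Joda-style "yyyy-MM-dd HH:mm:ss" corresponds to "%Y-%m-%d %H:%M:%S" in strptime.
    row["tstp"] = datetime.strptime(row["tstp"], "%Y-%m-%d %H:%M:%S")
    # Mirrors the mutate/convert filters.
    for name in columns[1:]:
        row[name] = float(row[name])
    rows.append(row)

print(rows[0]["tstp"].year)       # 2000
print(rows[0]["Voltage_A_real"])  # 2535.53
```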
I am trying to insert the data into Elasticsearch via Logstash, with the following Logstash configuration:
input {
  file {
    path => "path_to_csv_file"
    sincedb_path => "/dev/null"
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => [
      "tstp",
      "Voltage_A_real",
      "Voltage_B_real",
      "Voltage_C_real"
    ]
    separator => ","
  }
  date {
    match => [ "tstp", "yyyy-MM-dd HH:mm:ss" ]
  }
  mutate {
    convert => ["Voltage_A_real", "float"]
    convert => ["Voltage_B_real", "float"]
    convert => ["Voltage_C_real", "float"]
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    action => "index"
    index => "temp_load_index"
  }
}
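Not part of the original question, but a common way to debug a pipeline like this is to temporarily swap the file input for stdin and pipe a few lines in, so every run reprocesses the data. A sketch, assuming the same filter block is kept:

```
# Hypothetical debug variant: replace the file input with stdin
input { stdin {} }

# then run, for example:
#   head -5 data.csv | logstash -f debug.conf
```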
When I run logstash -f conf_file -v, the output from rubydebug is:
{
           "message" => "2000-02-18 16:00:00,2532.38,-1067,-576.238",
          "@version" => "1",
        "@timestamp" => "2000-02-18T21:00:00.000Z",
              "path" => "path_to_csv",
              "host" => "myhost",
              "tstp" => "2000-02-18 16:00:00",
    "Voltage_A_real" => 2532.38,
    "Voltage_B_real" => -1067.0,
    "Voltage_C_real" => -576.238
}
However, when I look at the dashboard, I only see 2 events in Kibana, and both have the current datetime as their timestamp, rather than timestamps in the year-2000 range of the data. Can someone help me figure out what is going on?
A sample Kibana object looks like this:
{
  "_index": "temp_load_index",
  "_type": "logs",
  "_id": "myid",
  "_score": null,
  "_source": {
    "message": "2000-04-02 02:00:00,2528.76,-1068.49,-576.796",
    "@version": "1",
    "@timestamp": "2016-09-27T05:15:29.753Z",
    "path": "path_to_csv",
    "host": "myhost",
    "tstp": "2000-04-02 02:00:00",
    "Voltage_A_real": 2528.76,
    "Voltage_B_real": -1068.49,
    "Voltage_C_real": -576.796,
    "tags": [
      "_dateparsefailure"
    ]
  },
  "fields": {
    "@timestamp": [
      1474953329753
    ]
  },
  "sort": [
    1474953329753
  ]
}
Answer 0 (score: 0)
When you open Kibana, it usually only shows events from the last 15 minutes, based on the @timestamp field. You therefore need to set the time filter to an appropriate range (see the Kibana documentation); in your case, use the absolute option and start at 2000-01-01.
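Independently of Kibana's time filter, you can check whether any events actually landed with year-2000 timestamps by running a range query against Elasticsearch, e.g. via curl against the _count endpoint of the temp_load_index index. A sketch of the query body (index name taken from the configuration above):

```json
{
  "query": {
    "range": {
      "@timestamp": {
        "gte": "2000-01-01",
        "lt": "2001-01-01"
      }
    }
  }
}
```

If this returns a count of 0 while documents exist in the index, the events were stored with their ingestion-time @timestamp rather than the parsed one.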
Alternatively, you can store the parsed timestamp in a different field (e.g. original_tst), so that the @timestamp added by Logstash is preserved:
date {
  match => [ "tstp", "yyyy-MM-dd HH:mm:ss" ]
  target => "original_tst"
}
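The effect of the target option can be illustrated with a small sketch. This is not Logstash itself, just the semantics: the parsed value goes into original_tst while @timestamp is left untouched (the sample values are taken from the Kibana object above):

```python
from datetime import datetime

# A minimal stand-in for the event shown in the Kibana object above.
event = {
    "tstp": "2000-02-18 16:00:00",
    "@timestamp": "2016-09-27T05:15:29.753Z",  # ingestion time added by Logstash
}

# date filter with target => "original_tst": parse tstp, store the result
# in a separate field instead of overwriting @timestamp.
event["original_tst"] = datetime.strptime(event["tstp"], "%Y-%m-%d %H:%M:%S")

print(event["original_tst"].year)  # 2000
print(event["@timestamp"])         # unchanged ingestion timestamp
```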