I am trying to follow the guide at the link below:
http://www.viaboxx.de/code/easily-generate-live-heatmaps-for-geolocations-with-elk/#codesyntax_1
It worked fine the first time, but when I try it now it gives the following error at the step where I load the CSV data. The command I run is:
cat test.csv | /opt/logstash/bin/logstash -f geostore.conf
I get the following error:
Settings: Default pipeline workers: 2
Pipeline main started
Error parsing csv {:field=>"message", :source=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>, :level=>:warn}
Pipeline main has been shutdown
stopping pipeline {:id=>"main"}
Can you help? I've spent days trying to figure this out.
Edit: adding geostore.conf:
input {
  stdin {}
}
filter {
  # Step 1, drop the csv header line
  if [message] =~ /^#/ {
    drop {}
  }
  # Step 2, split latitude and longitude
  csv {
    separator => ','
    columns => [ 'lat', 'lon' ]
  }
  # Step 3
  # move lat and lon into location object
  # for defined geo_point type in ES
  mutate {
    rename => [ "lat", "[location][lat]", "lon", "[location][lon]" ]
  }
}
output {
  elasticsearch {
    hosts => 'localhost'
    index => 'geostore'
    document_type => "locality"
    flush_size => 1000
  }
}
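For reference, the input this config expects is a plain two-column CSV of latitude and longitude, where any header line starting with # is dropped by the first filter step. A minimal sketch of such a test.csv, using coordinate values taken from the output further down (the header line itself is an assumption):
# lat, lon
53.97917361, -6.389038611
54.00310028, -6.397707778
51.92071667, -8.475726111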
I have changed the output section from:
output {
  elasticsearch {
    hosts => 'localhost'
    index => 'geostore'
    document_type => "locality"
    flush_size => 1000
  }
}
to this:
output {
  elasticsearch {
    hosts => 'localhost'
    index => 'geostore'
    document_type => "locality"
    flush_size => 1000
    stdout {}
  }
}
Now I get a more detailed error message:
fetched an invalid config {:config=>"input {\n stdin {}\n}\nfilter {\n #
Step 1, drop the csv header line\n if [message] =~ /^#/ {\n drop {}\n }\n
\n # Step 2, split latitude and longitude\n csv {\n separator => ','\n
columns => [ 'lat', 'lon' ]\n }\n \n # Step 3\n # move lat and lon into
location object \n # for defined geo_point type in ES\n mutate { \n rename
=> [ \"lat\", \"[location][lat]\", \"lon\", \"[location][lon]\" ]\n
}\n}\noutput {\n elasticsearch {\n hosts => 'localhost'\n index =>
'geostore'\n document_type => \"locality\"\n flush_size => 1000\n
stdout {}\n }\n}\n\n", :reason=>"Expected one of #, => at line 29, column 12
(byte 543) after output {\n elasticsearch {\n hosts => 'localhost'\n
index => 'geostore'\n document_type => \"locality\"\n flush_size =>
1000\n stdout ", :level=>:error}
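A note on this parse error: the dumped config shows stdout {} sitting inside the elasticsearch {} block, where Logstash only expects option => value settings, which is what the "Expected one of #, =>" message points at. A minimal sketch of an output section with stdout as a separate plugin alongside elasticsearch (the rubydebug codec is an assumption, used only for readable console output):
output {
  elasticsearch {
    hosts => 'localhost'
    index => 'geostore'
    document_type => "locality"
    flush_size => 1000
  }
  stdout { codec => rubydebug }
}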
I can't understand why it worked the first time.
Settings: Default pipeline workers: 2
Pipeline main started
Error parsing csv {:field=>"message", :source=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>, :level=>:warn}
2017-03-30T13:46:31.171Z localhost.localdomain 53.97917361, -6.389038611
2017-03-30T13:46:31.171Z localhost.localdomain 54.00310028, -6.397707778
2017-03-30T13:46:31.172Z localhost.localdomain 53.99960056, -6.381966111
2017-03-30T13:46:31.172Z localhost.localdomain 54.00534917, -6.423718889
2017-03-30T13:46:31.172Z localhost.localdomain 51.92071667, -8.475726111
2017-03-30T13:46:31.172Z localhost.localdomain 51.82731222, -8.381912222
2017-03-30T13:46:31.173Z localhost.localdomain 51.81096639, -8.415731667
2017-03-30T13:46:31.173Z localhost.localdomain 54.28450222, -8.463775556
2017-03-30T13:46:31.173Z localhost.localdomain 54.27841, -8.495700278
2017-03-30T13:46:31.173Z localhost.localdomain 54.2681225, -8.462056944
2017-03-30T13:46:31.174Z localhost.localdomain 52.276167, -9.680497
2017-03-30T13:46:31.174Z localhost.localdomain 52.25660139, -9.703921389
2017-03-30T13:46:31.174Z localhost.localdomain 52.27031306, -9.723975556
2017-03-30T13:46:31.174Z localhost.localdomain 54.95663111, -7.714384167
2017-03-30T13:46:31.175Z localhost.localdomain 54.00133111, -7.352790833
2017-03-30T13:46:31.175Z localhost.localdomain 52.34264222, -6.4854175
2017-03-30T13:46:31.176Z localhost.localdomain 52.32439028, -6.464626111
2017-03-30T13:46:31.176Z localhost.localdomain 52.33008944, -6.487005
2017-03-30T13:46:31.176Z localhost.localdomain 53.70765861, -6.374657778
2017-03-30T13:46:31.177Z localhost.localdomain 53.72636306, -6.326768611
2017-03-30T13:46:31.177Z localhost.localdomain 53.71461361, -6.336066111
2017-03-30T13:46:31.177Z localhost.localdomain 51.55948417, -9.244535833
2017-03-30T13:46:31.177Z localhost.localdomain 53.52894667, -7.358543056
2017-03-30T13:46:31.177Z localhost.localdomain 53.51801167, -7.324215
2017-03-30T13:46:31.179Z localhost.localdomain 53.16202278, -6.795522222
2017-03-30T13:46:31.179Z localhost.localdomain 53.182702, -6.819299
2017-03-30T13:46:31.179Z localhost.localdomain 52.83053972, -8.991989444
2017-03-30T13:46:31.180Z localhost.localdomain 52.85651944, -8.965725833
2017-03-30T13:46:31.180Z localhost.localdomain 53.02885028, -7.300381667
2017-03-30T13:46:31.180Z localhost.localdomain
Pipeline main has been shutdown
stopping pipeline {:id=>"main"}
Answer 0 (score: 0):
Hopefully this helps someone else as well.
I deleted the index from the command line:
curl -XDELETE 'localhost:9200/geostore?pretty';
Then I went to Kibana and deleted it from there as well. I reloaded the mapping as follows, and it worked:
curl -XPUT 'http://localhost:9200/geostore'
curl -XPUT 'http://localhost:9200/geostore/_mapping/locality' -d '
{
  "locality" : {
    "properties" : {
      "location" : {
        "type" : "geo_point",
        "geohash_prefix": true,
        "geohash_precision": "1km"
      }
    }
  }
}'
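To double-check that the mapping was applied before re-running Logstash, the index mapping can be fetched back; this is just a sanity check and not part of the original steps (the exact response shape depends on the Elasticsearch version):
curl -XGET 'http://localhost:9200/geostore/_mapping?pretty'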
cat test.csv | /opt/logstash/bin/logstash -f geostore.conf
It takes a few seconds for Logstash to start, parse the input, and store the results in Elasticsearch.
Now that the data is in Elasticsearch, let's move on to Kibana 4. After logging into Kibana, you need to add the index to Kibana.
Go to: Settings -> Indices -> Add New -> enter "geostore" in the index name field.
After adding the index you will see all the fields from the indexed documents; in particular, you should check that the location property is mapped as geo_point.
The following link describes the whole process in detail:
http://www.viaboxx.de/code/easily-generate-live-heatmaps-for-geolocations-with-elk/#codesyntax_1