Importing a JSON file into Elasticsearch and Kibana via Logstash (Docker ELK stack)

Date: 2018-10-24 13:20:29

Tags: docker elasticsearch logstash kibana elastic-stack

I am trying to import data stored in a JSON file into Elasticsearch/Kibana via Logstash. I have tried to solve this by searching, but without success.

I am using the Docker ELK stack provided here [git/docker-elk].

My logstash.conf currently looks like this:

input {
    tcp {
        port => 5000
    }

    file {
        path => ["/export.json"]
        codec => "json"
        start_position => "beginning"
    }
}

filter {
    json {
        source => "message"
    }
}

## Add your filters / logstash plugins configuration here

output {
    stdout {
        id => "stdout_test_id"
        codec => json
    }

    elasticsearch {
        hosts => "elasticsearch:9200"
        index => "logstash-indexname"
    }
}
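As an aside, the /export.json path above only exists inside the Logstash container if the file is mounted there. A minimal sketch of the relevant docker-compose.yml entry, assuming the file lives next to the compose file on the host, might look like this:

    logstash:
      volumes:
        # hypothetical mapping: host file ./export.json appears as /export.json in the container
        - ./export.json:/export.json:ro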

The JSON file is formatted as follows:

[{fields},{fields},{fields},...]

Full JSON structure: https://jsoneditoronline.org/?id=3d49813d38e641f6af6bf90e9a6481e3

I want to import everything inside each pair of braces into Elasticsearch as-is.
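For illustration only (these field names are made up), a file with that shape might look like:

    [{"id": 1, "name": "foo"}, {"id": 2, "name": "bar"}, {"id": 3, "name": "baz"}]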


Shell output after running docker-compose up:

logstash_1       | Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
logstash_1       | [2018-10-24T13:21:54,602][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
logstash_1       | [2018-10-24T13:21:54,612][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
logstash_1       | [2018-10-24T13:21:54,959][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or commandline options are specified
logstash_1       | [2018-10-24T13:21:55,003][INFO ][logstash.agent   ] No persistent UUID file found. Generating new UUID {:uuid=>"4a572899-c7ac-4b41-bcc0-7889983240b4", :path=>"/usr/share/logstash/data/uuid"}
logstash_1       | [2018-10-24T13:21:55,522][INFO ][logstash.runner   ] Starting Logstash {"logstash.version"=>"6.4.0"}
logstash_1       | [2018-10-24T13:21:57,552][INFO ][logstash.pipeline   ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
logstash_1       | [2018-10-24T13:21:58,018][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
logstash_1       | [2018-10-24T13:21:58,035][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elasticsearch:9200/, :path=>"/"}
logstash_1       | [2018-10-24T13:21:58,272][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
logstash_1       | [2018-10-24T13:21:58,377][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
logstash_1       | [2018-10-24T13:21:58,381][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
logstash_1       | [2018-10-24T13:21:58,419][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//elasticsearch:9200"]}
logstash_1       | [2018-10-24T13:21:58,478][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
logstash_1       | [2018-10-24T13:21:58,529][INFO ][logstash.inputs.tcp   ] Starting tcp input listener {:address=>"0.0.0.0:5000", :ssl_enable=>"false"}
logstash_1       | [2018-10-24T13:21:58,538][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
logstash_1       | [2018-10-24T13:21:58,683][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
elasticsearch_1  | [2018-10-24T13:21:58,785][WARN ][o.e.d.a.a.i.t.p.PutIndexTemplateRequest] Deprecated field [template] used, replaced by [index_patterns]
elasticsearch_1  | [2018-10-24T13:21:59,036][WARN ][o.e.d.i.m.MapperService  ] [_default_] mapping is deprecated since it is not useful anymore nowthat indexes cannot have more than one type
elasticsearch_1  | [2018-10-24T13:21:59,041][INFO ][o.e.c.m.MetaDataIndexTemplateService] [riEmfTq] adding template [logstash] for index patterns [logstash-*]
logstash_1       | [2018-10-24T13:21:59,158][INFO ][logstash.inputs.file   ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_1ed00aa8bbe3029ead0818433d122587", :path=>["/export.json"]}
logstash_1       | [2018-10-24T13:21:59,210][INFO ][logstash.pipeline   ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x4b7995b9 sleep>"}
logstash_1       | [2018-10-24T13:21:59,337][INFO ][logstash.agent   ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
logstash_1       | [2018-10-24T13:21:59,357][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
logstash_1       | [2018-10-24T13:21:59,760][INFO ][logstash.agent   ] Successfully started Logstash API endpoint {:port=>9600}

1 Answer:

Answer 0 (score: 2)

The problem is that this file contains all of the documents inside a JSON array wrapped onto a single line, and Logstash cannot easily read such a file.

I suggest converting it into another file in which each JSON document sits on its own line, so that Logstash can consume it easily.

First, run the following command (you may have to install the jq utility first):

 cat export.json | jq -c '.[]' > export_lines.json
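With the made-up three-object array from the illustration above, export_lines.json would then contain one compact JSON object per line:

    {"id":1,"name":"foo"}
    {"id":2,"name":"bar"}
    {"id":3,"name":"baz"}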

Then change your file input to:

 path => ["/export_lines.json"]
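Putting it together, the file input block becomes something like the sketch below (sincedb_path => "/dev/null" is an optional extra that forces Logstash to re-read the file from the beginning on every restart):

    file {
        path => ["/export_lines.json"]
        codec => "json"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }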

Re-run Logstash and enjoy!
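To verify that the documents made it in, you can query the index directly; this assumes Elasticsearch is published on localhost:9200 as in the default docker-elk compose file (if X-Pack security is enabled you may also need to pass credentials with -u):

    curl -s 'http://localhost:9200/logstash-indexname/_search?pretty&size=1'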