Unable to index data into Elasticsearch using Logstash

Asked: 2019-01-07 14:03:52

Tags: mongodb elasticsearch indexing import logstash

I am trying to import data from a .json file into Elasticsearch using Logstash. Just for testing, I created a dummy exportedJson.json file and placed it inside Logstash's bin folder. Here is what it looks like:

{"_id":{"$oid":"5c2486b191dea3d16f74aa0f"},"Application":"My InCar","Category":"Service","Message":"HardwareNotConnectedOrInitialized","IRSAClientID":1,"Station":"My-Station","StationId":1,"ExceptionTime":"2018-05-16 13:20:17.000Z","LogTime":"2018-05-16 14:19:35.000Z"} {"_id":{"$oid":"5c248eb3c2158801495d32f0"},"Application":"My InCar","Category":"Service","Message":"ServiceFailureOnRestart","IRSAClientID":1,"Station":"My-Station","StationId":1,"ExceptionTime":"2018-05-16 15:30:17.000Z","LogTime":"2018-05-16 16:40:35.000Z"}

Here is my Logstash configuration file, importToElastic.conf:

input {
    file {
        path => ["E:/Elasticsearch/logstash-6.5.4/bin/exportedJson.json"]
        start_position => "beginning"
        sincedb_path => "NUL"
        codec => json
    }
}
filter {
    mutate {
        rename => { "_id" => "mongo_id" }
    }
    date {
        match => ["ExceptionTime", "yyyy-MM-dd HH:mm:ss.SSSZ"]
        timezone => "UTC"
        target => "exceptionTime"
    }
    date {
        match => ["LogTime", "yyyy-MM-dd HH:mm:ss.SSSZ"]
        timezone => "UTC"
        target => "logTime"
    }

}
output {
    elasticsearch {
        hosts => ["127.0.0.1:9200"]
        index => "logexception"
        doc_as_upsert => true
        document_id => "%{mongo_id}"
    }
}

Now when I run

logstash -f importToElastic.conf

the following log shows up on the console:

E:\Elasticsearch\logstash-6.5.4\bin>logstash -f importToElastic.conf
Sending Logstash logs to E:/Elasticsearch/logstash-6.5.4/logs which is now configured via log4j2.properties
[2019-01-07T18:48:17,639][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-07T18:48:17,656][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.5.4"}
[2019-01-07T18:48:21,017][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-01-07T18:48:21,470][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2019-01-07T18:48:21,708][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://127.0.0.1:9200/"}
[2019-01-07T18:48:21,771][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-01-07T18:48:21,775][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2019-01-07T18:48:21,802][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//127.0.0.1:9200"]}
[2019-01-07T18:48:21,825][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2019-01-07T18:48:21,850][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-01-07T18:48:22,263][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x3f21ab01 sleep>"}
[2019-01-07T18:48:22,316][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-01-07T18:48:22,329][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2019-01-07T18:48:22,677][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

But the JSON data is not indexed into Elasticsearch. Where am I going wrong? Can someone please guide me? I am new to Elasticsearch.
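To narrow things down, I suppose a stdout output could be added next to the elasticsearch output so that any events Logstash actually emits become visible on the console. This is only a debugging sketch, not part of my actual config:

```
output {
    # Print every event to the console in a readable form, so we can
    # tell whether events are being read and parsed at all.
    stdout {
        codec => rubydebug
    }
}
```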

0 Answers:

No answers