Uploading a CSV to Elasticsearch with Logstash

Date: 2017-08-13 08:38:07

Tags: elasticsearch, logstash

I am trying to upload a simple CSV file to Elasticsearch using Logstash. My file looks like this:

User_Id,Age,Gender,Occupation,Zip_Code
1,24,M,technician,85711
2,53,F,other,94043
3,23,M,writer,32067
4,24,M,technician,43537

My config file looks like this:

input {
    file {
        path => "/Users/office/Desktop/Elasticsearch data/ufo.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}
filter {
    csv {
        separator => ","
        columns => ["User_Id", "Age", "Gender", "Occupation", "Zip_Code"]
    }
    #mutate { convert => ["User_Id", "integer"] }
    #mutate { convert => ["Age", "integer"] }
}
output {
    elasticsearch {
        hosts => "http://localhost:9200"
        index => "ufo"
        document_type => "found"
    }
    stdout {}
}
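As a first sanity check (this step is not in the original post), Logstash can validate a pipeline definition without starting it, using the --config.test_and_exit flag:

    C:\logstash\bin>.\logstash -f /Users/office/Desktop/Elasticsearch_data/LogSt_UFO.config --config.test_and_exit

If the file parses cleanly, Logstash reports "Configuration OK" and exits; the config path here is copied from the debug run below and may differ on your machine.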

When I run Logstash, the file is not imported into Elasticsearch.

This is the log output I get from Logstash:

Sending Logstash's logs to C:/logstash/logs which is now configured via log4j2.properties
[2017-08-13T11:34:25,717][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-08-13T11:34:25,733][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-08-13T11:34:25,889][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x67fa62de URL:http://localhost:9200/>}
[2017-08-13T11:34:25,889][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-08-13T11:34:25,951][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-08-13T11:34:25,967][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::HTTP:0x23015a5 URL:http://localhost:9200>]}
[2017-08-13T11:34:25,967][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-08-13T11:34:26,544][INFO ][logstash.pipeline        ] Pipeline main started
[2017-08-13T11:34:26,763][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

What is wrong? What should I do?

Many thanks, Tal
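As an added diagnostic (not part of the original question), you can ask Elasticsearch directly whether any documents arrived; both endpoints below are standard Elasticsearch APIs:

    curl "http://localhost:9200/_cat/indices?v"
    curl "http://localhost:9200/ufo/_count?pretty"

If the first command does not list a ufo index, or the second returns a 404 or "count": 0, the events never made it out of Logstash, which points at the input side rather than the output.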

I also ran it in debug mode:

C:\logstash\bin>.\logstash --debug -f /Users/office/Desktop/Elasticsearch_data/LogSt_UFO.config
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Sending Logstash's logs to C:/logstash/logs which is now configured via log4j2.properties
[2017-08-14T14:50:22,747][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
[2017-08-14T14:50:22,747][DEBUG][logstash.runner          ] node.name: "ILD05247"
[2017-08-14T14:50:22,747][DEBUG][logstash.runner          ] *path.config: "/Users/office/Desktop/Elasticsearch_data/LogSt_UFO.config"
[2017-08-14T14:50:22,747][DEBUG][logstash.runner          ] path.data: "C:/logstash/data"
[2017-08-14T14:50:22,747][DEBUG][logstash.runner          ] config.test_and_exit: false
[2017-08-14T14:50:22,747][DEBUG][logstash.runner          ] config.reload.automatic: false
[2017-08-14T14:50:22,747][DEBUG][logstash.runner          ] config.reload.interval: 3
[2017-08-14T14:50:22,747][DEBUG][logstash.runner          ] metric.collect: true
[2017-08-14T14:50:22,747][DEBUG][logstash.runner          ] pipeline.id: "main"
[2017-08-14T14:50:22,762][DEBUG][logstash.runner          ] pipeline.system: false
[2017-08-14T14:50:22,762][DEBUG][logstash.runner          ] pipeline.workers: 4
[2017-08-14T14:50:22,762][DEBUG][logstash.runner          ] pipeline.output.workers: 1
[2017-08-14T14:50:22,762][DEBUG][logstash.runner          ] pipeline.batch.size: 125
[2017-08-14T14:50:22,762][DEBUG][logstash.runner          ] pipeline.batch.delay: 5
[2017-08-14T14:50:22,762][DEBUG][logstash.runner          ] pipeline.unsafe_shutdown: false
[2017-08-14T14:50:22,762][DEBUG][logstash.runner          ] path.plugins: []
[2017-08-14T14:50:22,762][DEBUG][logstash.runner          ] config.debug: false
[2017-08-14T14:50:22,762][DEBUG][logstash.runner          ] *log.level: "debug" (default: "info")
[2017-08-14T14:50:22,762][DEBUG][logstash.runner          ] version: false
[2017-08-14T14:50:22,762][DEBUG][logstash.runner          ] help: false
[2017-08-14T14:50:22,778][DEBUG][logstash.runner          ] log.format: "plain"
[2017-08-14T14:50:22,778][DEBUG][logstash.runner          ] http.host: "127.0.0.1"
[2017-08-14T14:50:22,778][DEBUG][logstash.runner          ] http.port: 9600..9700
[2017-08-14T14:50:22,778][DEBUG][logstash.runner          ] http.environment: "production"
[2017-08-14T14:50:22,778][DEBUG][logstash.runner          ] queue.type: "memory"
[2017-08-14T14:50:22,778][DEBUG][logstash.runner          ] queue.drain: false
[2017-08-14T14:50:22,778][DEBUG][logstash.runner          ] queue.page_capacity: 262144000
[2017-08-14T14:50:22,778][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
[2017-08-14T14:50:22,794][DEBUG][logstash.runner          ] queue.max_events: 0
[2017-08-14T14:50:22,794][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
[2017-08-14T14:50:22,794][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
[2017-08-14T14:50:22,794][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
[2017-08-14T14:50:22,794][DEBUG][logstash.runner          ] slowlog.threshold.warn: -1
[2017-08-14T14:50:22,794][DEBUG][logstash.runner          ] slowlog.threshold.info: -1
[2017-08-14T14:50:22,794][DEBUG][logstash.runner          ] slowlog.threshold.debug: -1
[2017-08-14T14:50:22,794][DEBUG][logstash.runner          ] slowlog.threshold.trace: -1
[2017-08-14T14:50:22,809][DEBUG][logstash.runner          ] path.queue: "C:/logstash/data/queue"
[2017-08-14T14:50:22,809][DEBUG][logstash.runner          ] path.settings: "C:/logstash/config"
[2017-08-14T14:50:22,809][DEBUG][logstash.runner          ] path.logs: "C:/logstash/logs"
[2017-08-14T14:50:22,809][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
[2017-08-14T14:50:22,840][DEBUG][logstash.agent           ] Agent: Configuring metric collection
[2017-08-14T14:50:22,840][DEBUG][logstash.instrument.periodicpoller.os] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-08-14T14:50:22,872][DEBUG][logstash.instrument.periodicpoller.jvm] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-08-14T14:50:22,918][DEBUG][logstash.instrument.periodicpoller.persistentqueue] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-08-14T14:50:22,918][DEBUG][logstash.agent           ] Reading config file {:config_file=>"C:/Users/office/Desktop/Elasticsearch_data/LogSt_UFO.config"}
[2017-08-14T14:50:23,012][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"file", :type=>"input", :class=>LogStash::Inputs::File}
[2017-08-14T14:50:23,028][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"plain", :type=>"codec", :class=>LogStash::Codecs::Plain}
[2017-08-14T14:50:23,044][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_72070627-b8cf-4e5f-b0f1-e31da0f1b0ed"
[2017-08-14T14:50:23,044][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2017-08-14T14:50:23,044][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2017-08-14T14:50:23,044][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@path = ["/Users/office/Desktop/Elasticsearch_data/ufo.csv"]
[2017-08-14T14:50:23,044][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@start_position = "beginning"
[2017-08-14T14:50:23,044][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@sincedb_path = "/dev/null"
[2017-08-14T14:50:23,044][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@id = "b4c6367a76926e5cb94c573a455d73551af49844-1"
[2017-08-14T14:50:23,044][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@enable_metric = true
[2017-08-14T14:50:23,044][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@codec = <LogStash::Codecs::Plain id=>"plain_72070627-b8cf-4e5f-b0f1-e31da0f1b0ed", enable_metric=>true, charset=>"UTF-8">
[2017-08-14T14:50:23,059][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@add_field = {}
[2017-08-14T14:50:23,059][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@stat_interval = 1
[2017-08-14T14:50:23,059][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@discover_interval = 15
[2017-08-14T14:50:23,059][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@sincedb_write_interval = 15
[2017-08-14T14:50:23,059][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@delimiter = "\n"
[2017-08-14T14:50:23,059][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@close_older = 3600
[2017-08-14T14:50:23,122][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"csv", :type=>"filter", :class=>LogStash::Filters::CSV}
[2017-08-14T14:50:23,145][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@separator = ","
[2017-08-14T14:50:23,146][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@columns = ["User_Id", "Age", "Gender", "Occupation", "Zip_Code"]
[2017-08-14T14:50:23,147][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@id = "b4c6367a76926e5cb94c573a455d73551af49844-2"
[2017-08-14T14:50:23,148][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@enable_metric = true
[2017-08-14T14:50:23,150][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@add_tag = []
[2017-08-14T14:50:23,151][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@remove_tag = []
[2017-08-14T14:50:23,152][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@add_field = {}
[2017-08-14T14:50:23,154][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@remove_field = []
[2017-08-14T14:50:23,155][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@periodic_flush = false
[2017-08-14T14:50:23,156][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@source = "message"
[2017-08-14T14:50:23,159][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@quote_char = "\""
[2017-08-14T14:50:23,162][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@autogenerate_column_names = true
[2017-08-14T14:50:23,163][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@skip_empty_columns = false
[2017-08-14T14:50:23,165][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@convert = {}
[2017-08-14T14:50:23,705][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"elasticsearch", :type=>"output", :class=>LogStash::Outputs::ElasticSearch}
[2017-08-14T14:50:23,705][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_bd937c2a-ffaa-4295-8294-cf59c415ff1f"
[2017-08-14T14:50:23,705][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2017-08-14T14:50:23,705][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2017-08-14T14:50:23,721][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [http://localhost:9200]
[2017-08-14T14:50:23,721][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "ufo"
[2017-08-14T14:50:23,721][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@document_type = "found"
[2017-08-14T14:50:23,721][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "b4c6367a76926e5cb94c573a455d73551af49844-3"
[2017-08-14T14:50:23,721][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2017-08-14T14:50:23,721][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_bd937c2a-ffaa-4295-8294-cf59c415ff1f", enable_metric=>true, charset=>"UTF-8">
[2017-08-14T14:50:23,721][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2017-08-14T14:50:23,721][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2017-08-14T14:50:23,721][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2017-08-14T14:50:23,721][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2017-08-14T14:50:23,737][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2017-08-14T14:50:23,737][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@idle_flush_time = 1
[2017-08-14T14:50:23,737][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2017-08-14T14:50:23,737][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2017-08-14T14:50:23,737][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2017-08-14T14:50:23,737][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2017-08-14T14:50:23,737][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2017-08-14T14:50:23,737][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2017-08-14T14:50:23,737][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2017-08-14T14:50:23,752][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2017-08-14T14:50:23,752][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2017-08-14T14:50:23,752][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2017-08-14T14:50:23,752][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2017-08-14T14:50:23,752][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2017-08-14T14:50:23,752][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2017-08-14T14:50:23,752][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2017-08-14T14:50:23,768][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2017-08-14T14:50:23,768][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2017-08-14T14:50:23,768][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2017-08-14T14:50:23,768][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2017-08-14T14:50:23,768][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2017-08-14T14:50:23,779][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2017-08-14T14:50:23,780][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2017-08-14T14:50:23,781][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2017-08-14T14:50:23,797][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"stdout", :type=>"output", :class=>LogStash::Outputs::Stdout}
[2017-08-14T14:50:23,826][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"line", :type=>"codec", :class=>LogStash::Codecs::Line}
[2017-08-14T14:50:23,831][DEBUG][logstash.codecs.line     ] config LogStash::Codecs::Line/@id = "line_7326a4ae-0304-4339-b4b5-22989cefa3bc"
[2017-08-14T14:50:23,832][DEBUG][logstash.codecs.line     ] config LogStash::Codecs::Line/@enable_metric = true
[2017-08-14T14:50:23,833][DEBUG][logstash.codecs.line     ] config LogStash::Codecs::Line/@charset = "UTF-8"
[2017-08-14T14:50:23,836][DEBUG][logstash.codecs.line     ] config LogStash::Codecs::Line/@delimiter = "\n"
[2017-08-14T14:50:23,857][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@id = "b4c6367a76926e5cb94c573a455d73551af49844-4"
[2017-08-14T14:50:23,859][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@enable_metric = true
[2017-08-14T14:50:23,862][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@codec = <LogStash::Codecs::Line id=>"line_7326a4ae-0304-4339-b4b5-22989cefa3bc", enable_metric=>true, charset=>"UTF-8", delimiter=>"\n">
[2017-08-14T14:50:23,863][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@workers = 1
[2017-08-14T14:50:23,872][DEBUG][logstash.agent           ] starting agent
[2017-08-14T14:50:23,877][DEBUG][logstash.agent           ] starting pipeline {:id=>"main"}
[2017-08-14T14:50:23,885][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2017-08-14T14:50:24,213][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-08-14T14:50:24,213][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-08-14T14:50:24,401][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x3d0cad84 URL:http://localhost:9200/>}
[2017-08-14T14:50:24,401][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-08-14T14:50:24,463][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-08-14T14:50:24,479][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2017-08-14T14:50:24,479][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::HTTP:0x2f0909f2 URL:http://localhost:9200>]}
[2017-08-14T14:50:24,494][DEBUG][logstash.filters.csv     ] CSV parsing options {:col_sep=>",", :quote_char=>"\""}
[2017-08-14T14:50:24,494][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-08-14T14:50:25,026][INFO ][logstash.pipeline        ] Pipeline main started
[2017-08-14T14:50:25,041][DEBUG][logstash.inputs.file     ] _globbed_files: /Users/office/Desktop/Elasticsearch_data/ufo.csv: glob is: ["/Users/office/Desktop/Elasticsearch_data/ufo.csv"]
[2017-08-14T14:50:25,104][DEBUG][logstash.agent           ] Starting puma
[2017-08-14T14:50:25,104][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2017-08-14T14:50:25,119][DEBUG][logstash.api.service     ] [api-service] start
[2017-08-14T14:50:25,276][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-08-14T14:50:30,089][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:50:35,089][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:50:39,109][DEBUG][logstash.inputs.file     ] _globbed_files: /Users/office/Desktop/Elasticsearch_data/ufo.csv: glob is: ["/Users/office/Desktop/Elasticsearch_data/ufo.csv"]
[2017-08-14T14:50:40,091][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:50:45,091][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:50:50,105][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:50:54,192][DEBUG][logstash.inputs.file     ] _globbed_files: /Users/office/Desktop/Elasticsearch_data/ufo.csv: glob is: ["/Users/office/Desktop/Elasticsearch_data/ufo.csv"]
[2017-08-14T14:50:55,112][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:51:01,322][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:51:06,472][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:51:11,379][DEBUG][logstash.inputs.file     ] _globbed_files: /Users/office/Desktop/Elasticsearch_data/ufo.csv: glob is: ["/Users/office/Desktop/Elasticsearch_data/ufo.csv"]
[2017-08-14T14:51:11,473][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:51:16,479][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:51:21,479][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:51:26,452][DEBUG][logstash.inputs.file     ] _globbed_files: /Users/office/Desktop/Elasticsearch_data/ufo.csv: glob is: ["/Users/office/Desktop/Elasticsearch_data/ufo.csv"]
[2017-08-14T14:51:26,499][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:51:31,499][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:51:36,509][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:51:41,520][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:51:41,536][DEBUG][logstash.inputs.file     ] _globbed_files: /Users/office/Desktop/Elasticsearch_data/ufo.csv: glob is: ["/Users/office/Desktop/Elasticsearch_data/ufo.csv"]
[2017-08-14T14:51:46,540][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:51:51,541][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:51:56,553][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:51:56,568][DEBUG][logstash.inputs.file     ] _globbed_files: /Users/office/Desktop/Elasticsearch_data/ufo.csv: glob is: ["/Users/office/Desktop/Elasticsearch_data/ufo.csv"]
[2017-08-14T14:52:01,553][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:52:06,563][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:52:11,577][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:52:11,655][DEBUG][logstash.inputs.file     ] _globbed_files: /Users/office/Desktop/Elasticsearch_data/ufo.csv: glob is: ["/Users/office/Desktop/Elasticsearch_data/ufo.csv"]
[2017-08-14T14:52:16,578][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:52:21,592][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:52:26,592][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:52:26,780][DEBUG][logstash.inputs.file     ] _globbed_files: /Users/office/Desktop/Elasticsearch_data/ufo.csv: glob is: ["/Users/office/Desktop/Elasticsearch_data/ufo.csv"]
[2017-08-14T14:52:31,593][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:52:36,593][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-08-14T14:52:41,608][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline

It looks like the pipeline is stuck pushing flushes forever and never processes the file.....
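The debug log supports that reading: the file input keeps re-globbing ufo.csv every fifteen seconds but never emits an event. A likely culprit on Windows (a hedged guess, not something the log states outright) is sincedb_path => "/dev/null": that device does not exist on Windows, and the file input commonly fails to track read positions with it, so nothing gets read. The usual Windows substitute is the NUL device. A sketch of the adjusted input block, assuming the path from the debug run is the correct one:

    input {
        file {
            path => "/Users/office/Desktop/Elasticsearch_data/ufo.csv"
            start_position => "beginning"
            # "NUL" is the Windows stand-in for /dev/null
            sincedb_path => "NUL"
        }
    }

Note also that the config posted above reads from "Elasticsearch data" (with a space) while the debug run reads from "Elasticsearch_data" (with an underscore); whichever is right, the path in the config must match the actual directory name.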

0 Answers
