Logstash reports "An unknown error occurred sending a bulk request to Elasticsearch"

Date: 2018-09-19 08:05:59

Tags: elasticsearch logstash logstash-jdbc

I am trying to move SQL Server table records into Elasticsearch through Logstash; it is basically a sync. But I am getting an "unknown error" from Logstash. My configuration file and the error log are below.

Configuration:

input {
  jdbc {
    # https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html#plugins-inputs-jdbc-record_last_run
    jdbc_connection_string => "jdbc:sqlserver://localhost-serverdb;database=Application;user=dev;password=system23$"
    jdbc_user => nil
    # The path to our downloaded JDBC driver
    jdbc_driver_library => "C:\Program Files (x86)\sqljdbc6.2\enu\sqljdbc4-3.0.jar"
    # The name of the driver class for SQL Server
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"

    # Runs every minute.
    schedule => "* * * * *"
    # Runs at minute 0 of every hour, i.e. hourly.
    #schedule => "0 * * * *"

    last_run_metadata_path => "C:\Software\ElasticSearch\logstash-6.4.0\.logstash_jdbc_last_run"
    #record_last_run => false
    #clean_run => true

    # Query for testing purposes
    statement => "Select * from tbl_UserDetails"
  }
}

output {
  elasticsearch {
    hosts => ["10.187.144.113:9200"]
    index => "tbl_UserDetails"
    # document_id must be unique; provide it during sync, otherwise we may get duplicate entries in the Elasticsearch index.
    document_id => "%{Login_User_Id}"
  }
}

Error log:

[2018-09-18T21:04:32,171][ERROR][logstash.outputs.elasticsearch]
An unknown error occurred sending a bulk request to Elasticsearch. We will retry indefinitely {
:error_message=>"\"\\xF0\" from ASCII-8BIT to UTF-8",
:error_class=>"LogStash::Json::GeneratorError",
:backtrace=>[
"C:/Software/ElasticSearch/logstash-6.4.0/logstash-core/lib/logstash/json.rb:27:in `jruby_dump'",
"C:/Software/ElasticSearch/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:119:in `block in bulk'",
"org/jruby/RubyArray.java:2486:in `map'",
"C:/Software/ElasticSearch/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:119:in `block in bulk'",
"org/jruby/RubyArray.java:1734:in `each'",
"C:/Software/ElasticSearch/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:117:in `bulk'",
"C:/Software/ElasticSearch/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:275:in `safe_bulk'",
"C:/Software/ElasticSearch/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:180:in `submit'",
"C:/Software/ElasticSearch/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:148:in `retrying_submit'",
"C:/Software/ElasticSearch/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:38:in `multi_receive'",
"org/logstash/config/ir/compiler/OutputStrategyExt.java:114:in `multi_receive'",
"org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:97:in `multi_receive'",
"C:/Software/ElasticSearch/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:372:in `block in output_batch'",
"org/jruby/RubyHash.java:1343:in `each'",
"C:/Software/ElasticSearch/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:371:in `output_batch'",
"C:/Software/ElasticSearch/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:323:in `worker_loop'",
"C:/Software/ElasticSearch/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:285:in `block in start_workers'"]}

[2018-09-18T21:05:00,140][INFO ][logstash.inputs.jdbc     ] (0.008273s) Select * from tbl_UserDetails

Logstash version: 6.4.0, Elasticsearch version: 6.3.1

Thank you.

2 answers:

Answer 0 (score: 1)

There is a '\xF0' character in your database that is causing this problem. This '\xF0' is most likely the first byte of a multi-byte UTF-8 character. But because Ruby is trying to decode the data as ASCII-8BIT here, it treats every byte as a separate character.
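The error message can be reproduced in plain Ruby (Logstash runs on JRuby, which behaves the same way); the emoji below is just an arbitrary 4-byte character used for illustration:

```ruby
# 0xF0 is the lead byte of a 4-byte UTF-8 sequence (many emoji start with it).
utf8   = "\u{1F600}"            # a single 4-byte UTF-8 character
binary = utf8.b                 # same bytes, but tagged as ASCII-8BIT (binary)
puts binary.bytes.map { |b| format('%02X', b) }.join(' ')  # prints "F0 9F 98 80"

# Taken one byte at a time, the lead byte has no UTF-8 meaning on its own,
# so converting it raises exactly the error seen in the Logstash log:
begin
  binary[0].encode('UTF-8')
rescue Encoding::UndefinedConversionError => e
  puts e.message                # prints "\xF0" from ASCII-8BIT to UTF-8
end
```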

You can try setting an appropriate charset with columns_charset: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html#plugins-inputs-jdbc-columns_charset
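A minimal sketch of that setting, assuming a hypothetical column name and charset (both must match what the database actually stores):

```
input {
  jdbc {
    # ... connection settings as before ...
    # Tell the plugin which charset a specific text column uses,
    # so its bytes are converted to UTF-8 instead of being read as ASCII-8BIT.
    columns_charset => { "user_name" => "ISO-8859-1" }
  }
}
```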

Answer 1 (score: -1)

The above issue has been resolved.

Thanks for your support, folks.

The change I made was to add the following two properties under input -> jdbc:

input {
  jdbc {
    tracking_column => "login_user_id"
    use_column_value => true
  }
}

And under output -> elasticsearch, I changed two properties:

output {
  elasticsearch {
    document_id => "%{login_user_id}"
    document_type => "user_details"
  }
}

The main takeaway here is that all values should be given in lowercase.
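Putting the two snippets together with the lowercase takeaway, a sketch of the relevant parts of the corrected pipeline might look like this (Elasticsearch index names must be lowercase, so tbl_UserDetails becomes tbl_userdetails; other settings stay as in the original question):

```
input {
  jdbc {
    # ... connection settings as in the question ...
    statement => "Select * from tbl_UserDetails"
    tracking_column => "login_user_id"
    use_column_value => true
  }
}

output {
  elasticsearch {
    hosts => ["10.187.144.113:9200"]
    index => "tbl_userdetails"
    document_id => "%{login_user_id}"
    document_type => "user_details"
  }
}
```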