Logstash error: Error registering plugin, Pipeline aborted due to error (<TypeError: can't dup Fixnum>)

Time: 2018-04-22 07:41:15

Tags: logstash logstash-jdbc

I am a beginner with ELK and am trying to load data from MySQL into Elasticsearch (as a next step I want to query it through the Java REST client), so I am using logstash-6.2.4 and elasticsearch-6.2.4 and following the example here. When I run bin/logstash -f /path/to/my.conf, I get this error:

[2018-04-22T10:15:08,713][ERROR][logstash.pipeline        ] Error registering plugin {:pipeline_id=>"main", :plugin=>"<LogStash::Inputs::Jdbc jdbc_connection_string=>\"jdbc:mysql://localhost:3306/testdb\", jdbc_user=>\"root\", jdbc_password=><password>, jdbc_driver_library=>\"/usr/local/logstash-6.2.4/config/mysql-connector-java-6.0.6.jar\", jdbc_driver_class=>\"com.mysql.jdbc.Driver\", statement=>\"SELECT * FROM testtable\", id=>\"7ff303d15d8fc2537248f48fae5f3925bca7649bbafc30d2cd52394ea9961797\", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>\"plain_f8d44c47-8421-4bb9-a6b9-0b34e0aceb13\", enable_metric=>true, charset=>\"UTF-8\">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>\"info\", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, last_run_metadata_path=>\"/Users/chu/.logstash_jdbc_last_run\", use_column_value=>false, tracking_column_type=>\"numeric\", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>", :error=>"can't dup Fixnum", :thread=>"#<Thread:0x3fae16e2 run>"}
[2018-04-22T10:15:09,256][ERROR][logstash.pipeline        ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<TypeError: can't dup Fixnum>, :backtrace=>["org/jruby/RubyKernel.java:1882:in `dup'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/date/format.rb:838:in `_parse'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/date.rb:1830:in `parse'", "/usr/local/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/value_tracking.rb:87:in `set_value'", "/usr/local/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/value_tracking.rb:36:in `initialize'", "/usr/local/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/value_tracking.rb:29:in `build_last_value_tracker'", "/usr/local/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb:216:in `register'", "/usr/local/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:342:in `register_plugin'", "/usr/local/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:353:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/local/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:353:in `register_plugins'", "/usr/local/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:500:in `start_inputs'", "/usr/local/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:394:in `start_workers'", "/usr/local/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:290:in `run'", "/usr/local/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:250:in `block in start'"], :thread=>"#<Thread:0x3fae16e2 run>"}
[2018-04-22T10:15:09,314][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}

Here is testdbinit.conf (UTF-8 encoded):

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/testdb"
    jdbc_user => "root"
    jdbc_password => "mypassword"
    jdbc_driver_library => "/usr/local/logstash-6.2.4/config/mysql-connector-java-6.0.6.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * FROM testtable"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    hosts => "localhost:9200"
    index => "testdemo"
    document_id => "%{personid}"
    document_type => "person"
  }
}

Here is the table (database: testdb ---> table: testtable):

mysql> select * from testtable;
+----------+----------+-----------+-----------+-------+
| PersonID | LastName | FirstName | City      | flag  |
+----------+----------+-----------+-----------+-------+
|     1003 | McWell   | Sharon    | Cape Town | exist |
|     1002 | Baron    | Richard   | Cape Town | exist |
|     1001 | Kallis   | Jaques    | Cape Town | exist |
|     1004 | Zhaosi   | Nicholas  | Iron Hill | exist |
+----------+----------+-----------+-----------+-------+

I tried googling this problem but still have no clue. I suspect some kind of type-conversion error (TypeError: can't dup Fixnum) is causing it, but what exactly is this "dup Fixnum", and how do I fix it?
One more thing confuses me: I ran the same code yesterday and the data loaded into Elasticsearch successfully, and I could also search it through localhost:9200. But the next morning, when I tried the same thing on the same computer, I ran into these errors. I have spent a whole day on this; please give me some hints.

1 Answer:

Answer 0 (score: 2):

I asked the same question in the logstash community as well, and with their help I think I have found the solution:
The exception trace exception=>#<TypeError: can't dup Fixnum> indicates a type-conversion error. sql_last_value is initialized to 0 for numeric values and to 1970-01-01 for datetime values. I believe the sql_last_value stored in last_run_metadata_path was neither a valid number nor a valid datetime, so I added clean_run => true to the conf file and ran Logstash again; the error no longer occurred. With clean_run => true, the bad sql_last_value is reset to 0 (or 1970-01-01), the input thread keeps running, and the data is indexed successfully.
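
For anyone who hits the same error, here is a minimal sketch of the jdbc input from the question with clean_run => true added (the connection details are copied from the question; the password is a placeholder):

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/testdb"
    jdbc_user => "root"
    jdbc_password => "mypassword"
    jdbc_driver_library => "/usr/local/logstash-6.2.4/config/mysql-connector-java-6.0.6.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * FROM testtable"
    # discard the stored sql_last_value so tracking restarts from 0 / 1970-01-01
    clean_run => true
  }
}

Alternatively, inspecting or deleting the file referenced by last_run_metadata_path (shown in the log as /Users/chu/.logstash_jdbc_last_run) should achieve much the same thing, since that file is where the malformed sql_last_value was being read from.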