Not all data from Logstash is being indexed by Elasticsearch

Date: 2017-03-06 07:02:14

Tags: mysql elasticsearch logstash kibana elastic-stack


I am new to the ELK stack. I have Logstash sending data from MySQL to Elasticsearch, and in the terminal it looks like all 40,000 records have been sent, but when I go into Kibana I see that only 200 records have come in.
Here is the Logstash configuration file I am using:

# file: simple-out.conf
input {
    jdbc {
        # MySQL JDBC connection string to our database
        jdbc_connection_string => "jdbc:mysql://localhost:3306/tweets_articles"
        # The path to the downloaded JDBC driver
        jdbc_driver_library => "/etc/elasticsearch/elasticsearch-jdbc-2.3.3.1/lib/mysql-connector-java-5.1.38.jar"
        # The name of the MySQL driver class
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        # The user we wish to execute our statement as
        jdbc_user => "**"
        jdbc_password => "***"
        # our query
        statement => "SELECT * from tweets"
    }
}
output {
        elasticsearch { hosts => ["localhost:9200"] }
        stdout { codec => rubydebug }
}
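
Would checking the document count directly in Elasticsearch be the right way to tell whether the records are missing from the index or just not showing up in Kibana? Something like this (just a sketch; I'm assuming the default logstash-* index pattern, since I don't set an index in the output):

    curl -XGET 'localhost:9200/logstash-*/_count?pretty'

The count field in the response should be the number of documents that were actually indexed, independent of anything Kibana is displaying.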

Could this be a problem with the dates? In MySQL, when I print the records' times, they are in this format:

+---------------------+
| PUBLISHED_AT        |
+---------------------+
| 2017-03-06 03:43:51 |
| 2017-03-06 03:43:45 |
| 2017-03-06 03:43:42 |
| 2017-03-06 03:43:30 |
| 2017-03-06 03:43:00 |
+---------------------+
5 rows in set (0.00 sec)

But when I look at the output of the config in the terminal, it looks like this:

             "id" => 41298,
         "author" => "b'Terk'",
  "retweet_count" => "0",
 "favorite_count" => "0",
"followers_count" => "49",
  "friends_count" => "23",
           "body" => "create an ad",
   "published_at" => "2017-03-06T07:30:47.000Z",
       "@version" => "1",
     "@timestamp" => "2017-03-06T06:44:04.756Z"

Can anyone see why I am not getting all 40,000 records? Thanks.
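
If the date turns out to be part of the problem, would a date filter that sets @timestamp from published_at be the right fix? Something along these lines (just a sketch, using the field name from the output above):

    filter {
        date {
            # parse published_at (it arrives as ISO8601, as shown above)
            # and use it as the event's @timestamp
            match => ["published_at", "ISO8601"]
            target => "@timestamp"
        }
    }

That way the events would be bucketed by their published time rather than by the time Logstash ingested them.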

1 answer:

Answer 0 (score: 0)

I found that this is the answer: Kibana doesn't show any results in "Discover" tab

You can see where to click here. (screenshot)
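
In other words (if I'm reading the linked answer right), the records were indexed fine; Discover just wasn't showing them because of the time range selected in the time picker, since @timestamp is the ingest time rather than published_at. Widening the time range, or running an explicit range query, should show them. A rough sketch, assuming the default logstash-* index pattern:

    curl -XGET 'localhost:9200/logstash-*/_search?pretty' -H 'Content-Type: application/json' -d '
    {
      "size": 0,
      "query": {
        "range": { "@timestamp": { "gte": "now-7d" } }
      }
    }'

hits.total in the response is the number of documents whose @timestamp falls in the last seven days, regardless of what the Kibana time picker is currently set to.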