Problem using incremental ingest with the JDBC connector

Time: 2019-04-29 01:22:04

Tags: mysql jdbc apache-kafka apache-kafka-connect confluent

I am trying to use incremental ingest to produce messages to a topic whenever a table in MySQL is updated. It works in timestamp mode, but incrementing-column mode does not seem to work. When I insert a new row into the table, I don't see any messages published to the topic.

{
            "_comment": " --- JDBC-specific configuration below here  --- ",
            "_comment": "JDBC connection URL. This will vary by RDBMS. Consult your manufacturer's handbook for more information",
            "connection.url": "jdbc:mysql://localhost:3306/lte?user=root&password=tiger",

            "_comment": "Which table(s) to include",
            "table.whitelist": "candidate_score",

            "_comment": "Pull new rows based on an incrementing column. You can also do bulk or timestamp-based extracts. For more information, see http://docs.confluent.io/current/connect/connect-jdbc/docs/source_config_options.html#mode",
            "mode": "incrementing",

            "_comment": "Which column has the incrementing value to use?  ",
            "incrementing.column.name": "attempt_id",

            "_comment": "Set this to false only if the incrementing column is not declared NOT NULL  ",
            "validate.non.null": "true",

            "_comment": "The Kafka topic will be made up of this prefix, plus the table name  ",
            "topic.prefix": "mysql-"
    }
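The snippet above is only the JDBC-specific portion of a source-connector configuration. One common way to deploy such a config (an assumption; the question does not show how the connector was created) is to wrap it in a full connector payload and submit it to a Kafka Connect worker over its REST API. The worker address and file name below are hypothetical:

```shell
# Hypothetical deployment sketch: POST the connector config (saved as
# connector.json, wrapped in {"name": ..., "config": {...}}) to a Connect
# worker. Port 8083 is the Kafka Connect REST API default.
CONNECT_URL="http://localhost:8083/connectors"
echo "$CONNECT_URL"
# With a worker running, the actual call would be (not executed here):
# curl -X POST -H "Content-Type: application/json" \
#      --data @connector.json "$CONNECT_URL"
```

Checking `GET http://localhost:8083/connectors/<name>/status` afterwards shows whether the connector and its task are actually in the RUNNING state, which is worth verifying before assuming messages are missing.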

attempt_id is an auto-incrementing, non-null column, and it is also the primary key.

1 answer:

Answer 0 (score: 0):

Actually, this was my fault. I was consuming from the wrong topic.
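Given the config above, the topic name is `topic.prefix` plus the table name, so the messages land on `mysql-candidate_score`. A quick sketch of how to double-check this (assuming a broker on the default localhost:9092):

```shell
# The JDBC source connector names topics as topic.prefix + table name:
# "mysql-" (from topic.prefix) + "candidate_score" (from table.whitelist).
TOPIC="mysql-candidate_score"
echo "Expecting messages on topic: $TOPIC"
# With a local broker running, verify with the console consumer (not run here):
# kafka-console-consumer --bootstrap-server localhost:9092 \
#     --topic "$TOPIC" --from-beginning
```

Listing all topics (`kafka-topics --bootstrap-server localhost:9092 --list`) is another quick way to spot a typo in the topic being consumed.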