I'm running a Docker container with Kafka Connect and the Debezium SQL Server connector.
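For context, the Connect container is started roughly along the lines of the standard Debezium tutorial image (the topic names and link below are placeholders, not my actual values):

docker run -it --rm --name connect -p 8083:8083 \
  -e GROUP_ID=1 \
  -e CONFIG_STORAGE_TOPIC=my_connect_configs \
  -e OFFSET_STORAGE_TOPIC=my_connect_offsets \
  -e STATUS_STORAGE_TOPIC=my_connect_statuses \
  --link kafka:kafka \
  debezium/connect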
When I start the connector, it takes a snapshot of the database and sends messages to Kafka with the data as it exists in the database at that moment.
If I perform updates, inserts, and deletes in the database right after the snapshot and its messages have been sent to Kafka, the connector is fully responsive and picks up the changes immediately.
However, if I leave the connector running while nothing changes in the database (for, say, 30 minutes or an hour), the connector's thread seems to go to sleep and takes a long time to pick up new changes. In my last test the connector needed 20 minutes to wake up and receive the new updates, but once awake it was fully responsive again.
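To rule out a crashed task, the connector and task state can be checked through the Kafka Connect REST API while it is in this dormant state (Connect's default REST port 8083 assumed here):

curl -s http://localhost:8083/connectors/myConnector/status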
I found a question here on Stack Overflow describing a similar problem with the Debezium MongoDB connector. The solution described there doesn't work for me: I can't restart the connector task at the right moment, because I never know when the database changes will happen (they are made by a legacy application).
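For reference, that workaround boils down to restarting the task through the Connect REST API, something like the following (task id 0, since tasks.max is 1):

curl -X POST http://localhost:8083/connectors/myConnector/tasks/0/restart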
Does anyone know of another way to keep the connector task in sync with, and responsive to, database changes?
Connector configuration:
{
  "name": "myConnector",
  "config": {
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "tasks.max": "1",
    "database.history.kafka.topic": "asw.myConnector",
    "internal.key.converter.schemas.enable": "false",
    "table.whitelist": "dbo.user,dbo.test_table1,dbo.test_table2",
    "value.converter.basic.auth.credentials.source": "USER_INFO",
    "tombstones.on.delete": "false",
    "schema.registry.url": "http://xxxx:8081",
    "schema.registry.basic.auth.credentials.source": "USER_INFO",
    "database.history.kafka.recovery.poll.interval.ms": "20000",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "errors.log.enable": "true",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "database.user": "sa",
    "database.dbname": "myDb",
    "database.history.kafka.bootstrap.servers": "xxxx:9092",
    "database.server.name": "asw",
    "database.port": "1433",
    "key.converter.basic.auth.user.info": "xxxx",
    "value.converter.schema.registry.url": "http://xxxx:8081",
    "internal.key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.basic.auth.user.info": "xxxx",
    "database.hostname": "xxxx",
    "database.password": "xxxx",
    "internal.value.converter.schemas.enable": "false",
    "internal.value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "schema.registry.basic.auth.user.info": "xxxx",
    "key.converter.schema.registry.url": "http://xxxx:8081",
    "key.converter.basic.auth.credentials.source": "USER_INFO",
    "transforms": "InsertTenantId, InsertInstanceId, ValueToKey",
    "transforms.InsertTenantId.type": "org.apache.kafka.connect.transforms.InsertField$Value",
    "transforms.InsertTenantId.static.field": "tenant_id",
    "transforms.InsertTenantId.static.value": "xxxx",
    "transforms.InsertInstanceId.type": "org.apache.kafka.connect.transforms.InsertField$Value",
    "transforms.InsertInstanceId.static.field": "instance_id",
    "transforms.InsertInstanceId.static.value": "xxxx",
    "transforms.ValueToKey.type": "org.apache.kafka.connect.transforms.ValueToKey",
    "transforms.ValueToKey.fields": "tenant_id,instance_id"
  }
}
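(For completeness: a configuration like this is registered by POSTing it to the Connect REST API; host and port here are placeholders.)

curl -X POST -H "Content-Type: application/json" \
  --data @myConnector.json \
  http://localhost:8083/connectors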