Kafka Connect JDBC Sink - per-topic (table) pk.fields in a single sink configuration

Date: 2019-01-30 10:43:51

Tags: jdbc apache-kafka apache-kafka-connect confluent debezium

I am working from this debezium-example.

I have multiple topics, each with a different primary key:

item (pk: id)
itemDetail (pk: id, itemId)
itemLocation (pk: id, itemId)

jdbc-sink.source

{
"name": "jdbc-sink",
"config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "item,itemDetail,itemLocation",
    "connection.url": "jdbc:postgresql://postgres:5432/inventory?user=postgresuser&password=postgrespw",
    "transforms": "unwrap",
    "transforms.unwrap.type": "io.debezium.transforms.UnwrapFromEnvelope",
    "auto.create": "true",
    "insert.mode": "upsert",
    "pk.fields": "id",
    "pk.mode": "record_value"
}
}

How can we specify "pk.fields" per topic (table)?

1 Answer:

Answer 0 (score: 1)

I don't believe there is any such configuration for a per-topic PK mapping.

You will want to create a separate connector configuration for each topic:

{
"name": "jdbc-sink-item",
"config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "item",
    "pk.fields": "id"
}
}

{
"name": "jdbc-sink-itemDetail",
"config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "itemDetail",
    "pk.fields": "id,itemId"
}
}

and so on for the remaining topics.
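Since each per-topic connector only differs in its "topics" and "pk.fields" values, the definitions can be generated from a shared base. A minimal Python sketch, assuming the base settings from the question's jdbc-sink config; the helper name `build_sink_configs` is my own, and each generated definition could then be POSTed to the Kafka Connect REST API (e.g. http://localhost:8083/connectors) to register the sink:

```python
import json

# Settings shared by every sink instance (taken from the jdbc-sink config above).
BASE_CONFIG = {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "connection.url": "jdbc:postgresql://postgres:5432/inventory?user=postgresuser&password=postgrespw",
    "transforms": "unwrap",
    "transforms.unwrap.type": "io.debezium.transforms.UnwrapFromEnvelope",
    "auto.create": "true",
    "insert.mode": "upsert",
    "pk.mode": "record_value",
}

# One entry per topic, mapping to that table's primary-key columns.
TOPIC_KEYS = {
    "item": "id",
    "itemDetail": "id,itemId",
    "itemLocation": "id,itemId",
}

def build_sink_configs(topic_keys, base=BASE_CONFIG):
    """Return one connector definition per topic, each with its own pk.fields."""
    return [
        {
            "name": f"jdbc-sink-{topic}",
            "config": {**base, "topics": topic, "pk.fields": pk_fields},
        }
        for topic, pk_fields in topic_keys.items()
    ]

for connector in build_sink_configs(TOPIC_KEYS):
    # Each definition is a complete connector config ready to be
    # registered with the Kafka Connect REST API.
    print(json.dumps(connector, indent=2))
```

The trade-off versus the single three-topic connector in the question is three separate connector instances to manage, but it is the only way to give each table its own key columns with this sink.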