How do I configure the Debezium MySQL connector to produce a raw key instead of a struct/JSON object?

Date: 2019-03-09 19:43:57

Tags: mysql apache-kafka-connect debezium

I'm using Debezium to capture changes in a MySQL source table. How can I produce Kafka messages whose key is a plain numeric (Long) value rather than a JSON object?

What I get:

key: {"foo_id": 123} 
value: {"foo_id": 123, "bar": "blahblah", "baz": "meh......"}

What I want:

key: 123
value: {"foo_id": 123, "bar": "blahblah", "baz": "meh......"}

My FOO table looks like this:

foo_id: INT
bar: VARCHAR 
baz: VARCHAR

Note that I'm not using Avro, and I've already tried several combinations of the following (with and without the key converter), but could not get a Long key.

"transforms": "unwrap,insertKey,extractKey",
"transforms.unwrap.type":"io.debezium.transforms.UnwrapFromEnvelope",
"transforms.unwrap.drop.tombstones":"false",
"transforms.insertKey.type":"org.apache.kafka.connect.transforms.ValueToKey",
"transforms.insertKey.fields":"foo_id",
"transforms.extractKey.type":"org.apache.kafka.connect.transforms.ExtractField$Key",
"transforms.extractKey.field":"foo_id",        
"key.converter" : "org.apache.kafka.connect.converters.LongConverter",
"key.converter.schemas.enable": "false", 
"value.converter" : "org.apache.kafka.connect.json.JsonConverter",
"value.converter.schemas.enable": "false"

I'm not sure whether ValueToKey or ExtractField even applies to a (MySQL) source connector, but I'm getting the NPE below.

Caused by: java.lang.NullPointerException
        at org.apache.kafka.connect.transforms.ValueToKey.applyWithSchema(ValueToKey.java:85)
        at org.apache.kafka.connect.transforms.ValueToKey.apply(ValueToKey.java:65)
        at org.apache.kafka.connect.runtime.TransformationChain.lambda$apply$0(TransformationChain.java:44)
        at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:128)
        at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162) 
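
The NPE likely comes from records whose value schema lacks the configured field: `ValueToKey.applyWithSchema` looks the field up in the record's value schema and dereferences the result, so a record without `foo_id` (e.g. a Debezium schema-change event) yields null. A minimal Python sketch of that failure mode (the dicts and names here are illustrative, not the real Connect API):

```python
# Model a Connect record's value schema as a map of field name -> type.
# These schemas are illustrative stand-ins, not real Connect classes.
data_event_schema = {"foo_id": "int32", "bar": "string", "baz": "string"}
schema_change_schema = {"source": "struct", "ddl": "string"}  # no foo_id field

def value_to_key(schema, field):
    # ValueToKey.applyWithSchema calls schema.field(name) and then
    # dereferences the result; a missing field yields null -> NPE in Java.
    f = schema.get(field)
    if f is None:
        raise TypeError(f"field {field!r} not found in value schema")
    return f

print(value_to_key(data_event_schema, "foo_id"))  # int32

try:
    value_to_key(schema_change_schema, "foo_id")  # fails, like the NPE above
except TypeError as e:
    print(e)
```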

1 Answer:

Answer 0 (score: 0):

Based on https://issues.jboss.org/browse/DBZ-689, I found a solution:
{
...
    "config": {
    "transforms": "unwrap,insertKey,extractKey",
    "transforms.unwrap.type":"io.debezium.transforms.UnwrapFromEnvelope",
    "transforms.unwrap.drop.tombstones":"false",
    "transforms.insertKey.type":"org.apache.kafka.connect.transforms.ValueToKey",
    "transforms.insertKey.fields":"foo_id",
    "transforms.extractKey.type":"org.apache.kafka.connect.transforms.ExtractField$Key",
    "transforms.extractKey.field":"foo_id",        
    "key.converter" : "org.apache.kafka.connect.converters.IntegerConverter",
    "key.converter.schemas.enable": "true", 
    "value.converter" : "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false",
    "include.schema.changes": "false"  <-- this was missing
    }
}

Now I see `foo_id` as an Integer (no big deal that it's not a Long) :)
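
For reference, the converter choice also determines the raw key bytes on the wire: Kafka's `IntegerConverter` writes a 4-byte big-endian integer, while `LongConverter` writes 8 bytes. A quick sketch using Python's `struct` to mimic the Java serializers:

```python
import struct

def int_key_bytes(key: int) -> bytes:
    # 4-byte big-endian, as Kafka's IntegerConverter/IntegerSerializer emit.
    return struct.pack(">i", key)

def long_key_bytes(key: int) -> bytes:
    # 8-byte big-endian, as Kafka's LongConverter/LongSerializer emit.
    return struct.pack(">q", key)

print(int_key_bytes(123).hex())   # 0000007b
print(long_key_bytes(123).hex())  # 000000000000007b
```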