I am building a pipeline: MySQL -> Debezium -> Kafka -> Flink -> Kafka -> Kafka Connect JDBC -> MySQL. Below is a sample message I write from Flink (I have also tried the Kafka console producer):
{
  "schema": {
    "type": "struct",
    "fields": [
      {
        "type": "int64",
        "optional": false,
        "field": "id"
      },
      {
        "type": "string",
        "optional": true,
        "field": "name"
      }
    ],
    "optional": true,
    "name": "user"
  },
  "payload": {
    "id": 1,
    "name": "Smith"
  }
}
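For illustration, here is a minimal Java producer sketch that publishes an equivalent record (the broker address and the bare-string key are assumptions for the sketch, not details from the original setup):

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;

public class UserMessageProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // The schema/payload envelope from above, as a single JSON string.
        String value = "{"
                + "\"schema\":{\"type\":\"struct\",\"fields\":["
                + "{\"type\":\"int64\",\"optional\":false,\"field\":\"id\"},"
                + "{\"type\":\"string\",\"optional\":true,\"field\":\"name\"}],"
                + "\"optional\":true,\"name\":\"user\"},"
                + "\"payload\":{\"id\":1,\"name\":\"Smith\"}}";

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Note: the record key is a bare string, not a schema/payload envelope.
            producer.send(new ProducerRecord<>("user", "1", value));
        }
    }
}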
But Kafka Connect fails in JsonConverter with:
DataException: JsonConverter with schemas.enable requires "schema" and "payload" fields and may not contain additional fields. If you are trying to deserialize plain JSON data, set schemas.enable=false in your converter configuration.
at org.apache.kafka.connect.json.JsonConverter.toConnectData(JsonConverter.java:338)
I have debugged it; in the method public SchemaAndValue toConnectData(String topic, byte[] value), the value is null. My sink configuration is:
{
  "name": "user-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "user",
    "connection.url": "jdbc:mysql://localhost:3306/my_db?verifyServerCertificate=false",
    "connection.user": "root",
    "connection.password": "root",
    "auto.create": "true",
    "insert.mode": "upsert",
    "pk.fields": "id",
    "pk.mode": "record_value"
  }
}
Can anyone help me resolve this issue?
Answer 0 (score: 2)
I don't think the problem is related to the serialization of the value (of the Kafka message). It is the message key that is problematic.

What is your key.converter? I suspect it is the same as your value.converter (org.apache.kafka.connect.json.JsonConverter). Your key is most likely a plain String that contains no schema and payload fields. Try changing key.converter to org.apache.kafka.connect.storage.StringConverter.
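To see the failure in isolation, here is a small sketch (illustrative only, not from the original post) that configures a JsonConverter the way Connect configures a key converter and feeds it a bare string key; it throws the exact DataException shown above:

import org.apache.kafka.connect.data.SchemaAndValue;
import org.apache.kafka.connect.json.JsonConverter;
import java.nio.charset.StandardCharsets;
import java.util.Map;

public class ConverterDemo {
    public static void main(String[] args) {
        JsonConverter converter = new JsonConverter();
        // isKey = true: configure the converter as Connect does for record keys.
        converter.configure(Map.of("schemas.enable", "true"), true);

        // A bare string key, with no {"schema": ..., "payload": ...} envelope.
        byte[] key = "1".getBytes(StandardCharsets.UTF_8);

        // Throws DataException: JsonConverter with schemas.enable requires
        // "schema" and "payload" fields and may not contain additional fields.
        SchemaAndValue result = converter.toConnectData("user", key);
        System.out.println(result);
    }
}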
For Kafka Connect you configure default Converters worker-wide, but you can also set a specific converter on a particular connector configuration (it overrides the default). To do that, modify your configuration request:
{
  "name": "user-sink",
  "config": {
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "user",
    "connection.url": "jdbc:mysql://localhost:3306/my_db?verifyServerCertificate=false",
    "connection.user": "root",
    "connection.password": "root",
    "auto.create": "true",
    "insert.mode": "upsert",
    "pk.fields": "id",
    "pk.mode": "record_value"
  }
}
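As a usage sketch, the updated request can be submitted to the Kafka Connect REST API, for example with Java's built-in HTTP client (assuming the API listens on localhost:8083 and the JSON above is saved as user-sink.json; adjust to your deployment):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Path;

public class SubmitConnector {
    public static void main(String[] args) throws Exception {
        // POST /connectors creates the connector; if "user-sink" already exists,
        // delete it first, or PUT just the inner "config" object to
        // /connectors/user-sink/config instead.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8083/connectors"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofFile(Path.of("user-sink.json")))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}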