Standalone Kafka JDBC sink connector not inserting data into MySQL database (using confluent-community-2.12)
I have installed confluent-community-2.12 on CentOS 7 and started a standalone JDBC sink connector to consume records from a remote Kafka cluster. I can consume the records and see their data with a simple Java consumer (a rough sketch is included after the problem description), but when I start the sink connector, it loads and connects to the remote cluster, then simply stops at the following INFO log lines:
[2019-05-01 11:10:21,619] INFO Initializing writer using SQL dialect: MySqlDatabaseDialect (io.confluent.connect.jdbc.sink.JdbcSinkTask:57)
[2019-05-01 11:10:21,620] INFO WorkerSinkTask{id=sink-mysql-standalone-0} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:301)
Even after the connector has started, nothing happens on the sink connector when I produce data to the cluster.
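For reference, the simple Java consumer mentioned above is roughly the following; the bootstrap servers and group id are placeholders, and the topic matches the one in the sink config:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class SimpleTopicConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "remote-kafka:9092"); // placeholder
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "debug-consumer");             // placeholder
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic"));
            while (true) {
                // Poll the topic and print key/value of every record received
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("Key: %s, Value: %s%n", record.key(), record.value());
                }
            }
        }
    }
}

This consumer prints the key/value pairs shown further below, so the records are definitely reaching the topic.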
I checked the connector status with:
curl localhost:8083/connectors/sink-mysql-standalone/status
and got the following result:
{"name":"sink-mysql-standalone","connector":{"state":"RUNNING","worker_id":"10.3.0.40:8083"},"tasks":[{"id":0,"state":"RUNNING","worker_id":"10.3.0.40:8083"}],"type":"sink"}
sink.properties:
name=sink-mysql-standalone
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=my-topic
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
# JDBCSink connector specific configuration
connection.url=jdbc:mysql://10.3.0.37:3306/mydb?zeroDateTimeBehavior=convertToNull&useUnicode=yes&characterEncoding=UTF-8
connection.user=myuser
connection.password=mypassword
insert.mode=upsert
table.name.format=tbl_kaf_${topic}
pk.mode=kafka
pk.fields=__connect_topic,__connect_partition,__connect_offset
fields.whitelist=messageId
auto.create=true
auto.evolve=true
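The connector is launched in standalone mode roughly as follows (the worker properties path is illustrative; it points at the default Connect standalone config shipped with Confluent):

connect-standalone /etc/kafka/connect-standalone.properties sink.properties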
The producer produces the following record:
Key: id_5dfbdffe-ffbc-4fbf-925c-a14734304fa8, Value: {
  "type" : "text",
  "messageId" : "ID:activemq-XXXXXX-XXXXXXXXXXXXX-X:XX:1:2:2",
  "correlationId" : "",
  "destination" : {
    "type" : "queue",
    "name" : "qToKafka"
  },
  "replyTo" : null,
  "priority" : 0,
  "expiration" : 0,
  "timestamp" : 1556549819473,
  "redelivered" : false,
  "properties" : {},
  "payloadText" : "<Some XML Data>",
  "payloadMap" : null,
  "payloadBytes" : null
}
Please let me know what I am missing here. Thanks.