I am trying to read log transactions with Kafka Connect, using the connector kafka-connect-cdc-mssql:1.0.0-preview.
I am using the Confluent CLI, and I have modified the configuration file:
etc/schema-registry/connect-avro-standalone.properties
I added:
plugin.path=/usr/share/java,/tmp/Softwares/confluent-hub-client-latest/share/confluent-hub-components/confluentinc-kafka-connect-cdc-mssql/lib
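As a sanity check (a minimal sketch, assuming the Connect worker is running locally on its default REST port 8083), the worker can be asked which plugins it actually picked up from plugin.path:

# list the connector plugins the worker discovered on plugin.path
curl -s localhost:8083/connector-plugins
# the output should include io.confluent.connect.cdc.mssql.MsSqlSourceConnector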
My problem: when I run the command
$ confluent load mssqlCDC -d /tmp/Softwares/confluent-hub-client-latest/share/confluent-hub-components/confluentinc-kafka-connect-cdc-mssql/etc/mssqlsource.properties
I get this error message:
(23) Failed writing body
My mssqlsource.properties file:
name=mssqlsource
tasks.max=2
connector.class=io.confluent.connect.cdc.mssql.MsSqlSourceConnector
initial.database=(MYDATABASE)
server.name=(MYSERVER)
server.port=(PORT)
username=(MYUSER)
password=(MYPASS)
change.tracking.tables=(MYTABLE)
Answer 0 (score: 0)
The confluent load command is really just a curl command run in the background; it may be easier to run that curl command yourself and get the real error. Convert the properties file into an mssqlsource.json file:
{
  "name": "mssqlsource",
  "config": {
    "tasks.max": 2,
    "connector.class": "io.confluent.connect.cdc.mssql.MsSqlSourceConnector",
    "initial.database": "(MYDATABASE)",
    "server.name": "(MYSERVER)",
    "server.port": "(PORT)",
    "username": "(MYUSER)",
    "password": "(MYPASS)",
    "change.tracking.tables": "(MYTABLE)"
  }
}
Then run:
curl -XPOST -H "Content-Type: application/json" --data @/path/to/mssqlsource.json connect-host:8083/connectors
This will give you a better error message and is closer to a real-world example of using Kafka Connect, since the confluent command line is meant for local development environments.
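Once the POST succeeds, a minimal follow-up sketch (again assuming the worker's REST API is reachable at connect-host:8083) to confirm the connector and its tasks actually started:

# confirm the connector was created
curl -s connect-host:8083/connectors

# inspect connector and task state; a FAILED task includes its stack trace
curl -s connect-host:8083/connectors/mssqlsource/status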