Error: using kafka-connect-cdc-mssql:1.0.0-preview

Date: 2018-07-24 15:05:38

Tags: sql-server apache-kafka apache-kafka-connect confluent

I am trying to read transaction logs with Kafka Connect using the kafka-connect-cdc-mssql:1.0.0-preview connector.

I am using the Confluent CLI and have modified the configuration file:

etc/schema-registry/connect-avro-standalone.properties

I added:

plugin.path=/usr/share/java,/tmp/Softwares/confluent-hub-client-latest/share/confluent-hub-components/confluentinc-kafka-connect-cdc-mssql/lib

The problem occurs when I run the confluent load command:

$ confluent load mssqlCDC -d /tmp/Softwares/confluent-hub-client-latest/share/confluent-hub-components/confluentinc-kafka-connect-cdc-mssql/etc/mssqlsource.properties

I get this error message:

(23) Failed writing body

My mssqlsource.properties file:

name=mssqlsource
tasks.max=2
connector.class=io.confluent.connect.cdc.mssql.MsSqlSourceConnector
initial.database=(MYDATABASE)
server.name=(MYSERVER)
server.port=(PORT)
username=(MYUSER)
password=(MYPASS)
change.tracking.tables=(MYTABLE)

1 answer:

Answer 0: (score: 0)

The confluent load command is really just a curl command running in the background, so it may be easier to run curl yourself and get the actual error. Convert your properties file into an mssqlsource.json file:

{
  "name": "mssqlsource",
  "config": {
    "tasks.max": 2,
    "connector.class": "io.confluent.connect.cdc.mssql.MsSqlSourceConnector",
    "initial.database": "(MYDATABASE)",
    "server.name": "(MYSERVER)",
    "server.port": "(PORT)",
    "username": "(MYUSER)",
    "password": "(MYPASS)",
    "change.tracking.tables": "(MYTABLE)"
  }
}
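The conversion from the .properties format to this JSON payload can also be scripted. A minimal Python sketch (the properties content is inlined here for illustration; in practice you would read it from the file):

```python
import json

# Contents of mssqlsource.properties as shown in the question.
properties_text = """\
name=mssqlsource
tasks.max=2
connector.class=io.confluent.connect.cdc.mssql.MsSqlSourceConnector
initial.database=(MYDATABASE)
server.name=(MYSERVER)
server.port=(PORT)
username=(MYUSER)
password=(MYPASS)
change.tracking.tables=(MYTABLE)
"""

# Parse simple key=value lines, skipping blanks and comments.
config = {}
for line in properties_text.splitlines():
    line = line.strip()
    if not line or line.startswith("#"):
        continue
    key, _, value = line.partition("=")
    config[key.strip()] = value.strip()

# The Connect REST API expects "name" at the top level
# and everything else under "config".
payload = {"name": config.pop("name"), "config": config}
print(json.dumps(payload, indent=2))
```

Note that every value comes out as a string (e.g. "tasks.max": "2"); the Connect REST API accepts string values for numeric settings.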

Then run:

curl -XPOST -H "Content-Type: application/json" --data @/path/to/mssqlsource.json connect-host:8083/connectors

This will give you a better error message, and it is closer to a real-world use of Kafka Connect, since the confluent CLI is intended for local development environments.