Copying data from Kafka to MySQL: cannot connect to the JdbcSinkConnector using Docker and Debezium

Date: 2019-11-06 08:36:27

Tags: mysql apache-kafka-connect debezium

Hi, I am using Debezium to capture changes in MongoDB and push them into MySQL. I am following this example: https://github.com/debezium/debezium-examples/tree/master/unwrap-mongodb-smt, but with the final Postgres database replaced by a MySQL database, and I cannot get it to work.

Here is my modified jdbc-sink.json, where I use a MySQL URL for the connection:

{
    "name" : "jdbc-sink",
    "config" : {
        "connector.class":"io.confluent.connect.jdbc.JdbcSinkConnector",
        "tasks.max" : "1",
        "topics" : "customers",
        "connection.url" : "jdbc:mysql://localhost:3306/inventorydb?user=user&password=password",
        "auto.create" : "true",
        "auto.evolve" : "true",
        "insert.mode" : "upsert",
        "delete.enabled": "true",
        "pk.fields" : "id",
        "pk.mode": "record_key",
        "transforms": "mongoflatten",
        "transforms.mongoflatten.type" : "io.debezium.connector.mongodb.transforms.ExtractNewDocumentState",
        "transforms.mongoflatten.drop.tombstones": "false"
    }
}
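One detail worth checking in this config (an assumption based on the example's Docker Compose setup, not something stated in the question): when Kafka Connect itself runs in a container, `localhost` in `connection.url` resolves to the Connect container, not to the machine running MySQL. In a Compose setup the MySQL service name (assumed here to be `mysql`) would typically be used instead:

```json
"connection.url" : "jdbc:mysql://mysql:3306/inventorydb?user=user&password=password"
```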

But when I run the following, I get this error:

curl -i -X POST -H "Accept:application/json" -H  "Content-Type:application/json" http://localhost:8083/connectors/ -d @jdbc-sink.json
  

HTTP/1.1 500 Internal Server Error
Date: Wed, 06 Nov 2019 GMT
Content-Type: application/json
Content-Length: 3404
Server: Jetty(9.4.18.v20190429)


{"error_code":500,"message":"Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector, available connectors are:
PluginDesc{klass=class io.debezium.connector.mongodb.MongoDbConnector, name='io.debezium.connector.mongodb.MongoDbConnector', version='1.0.0-SNAPSHOT', encodedVersion=1.0.0-SNAPSHOT, type=source, typeName='source', location='file:/kafka/connect/debezium-connector-mongodb/'},
PluginDesc{klass=class io.debezium.connector.mysql.MySqlConnector, name='io.debezium.connector.mysql.MySqlConnector', version='1.0.0-SNAPSHOT', encodedVersion=1.0.0-SNAPSHOT, type=source, typeName='source', location='file:/kafka/connect/debezium-connector-mysql/'},
PluginDesc{klass=class io.debezium.connector.oracle.OracleConnector, name='io.debezium.connector.oracle.OracleConnector', version='1.0.0-SNAPSHOT', encodedVersion=1.0.0-SNAPSHOT, type=source, typeName='source', location='file:/kafka/connect/debezium-connector-oracle/'},
PluginDesc{klass=class io.debezium.connector.postgresql.PostgresConnector, name='io.debezium.connector.postgresql.PostgresConnector', version='1.0.0-SNAPSHOT', encodedVersion=1.0.0-SNAPSHOT, type=source, typeName='source', location='file:/kafka/connect/debezium-connector-postgres/'},
PluginDesc{klass=class io.debezium.connector.sqlserver.SqlServerConnector, name='io.debezium.connector.sqlserver.SqlServerConnector', version='1.0.0-SNAPSHOT', encodedVersion=1.0.0-SNAPSHOT, type=source, typeName='source', location='file:/kafka/connect/debezium-connector-sqlserver/'},
PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.3.0', encodedVersion=2.3.0, type=sink, typeName='sink', location='classpath'},
PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.3.0', encodedVersion=2.3.0, type=source, typeName='source', location='classpath'},
PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.3.0', encodedVersion=2.3.0, type=connector, typeName='connector', location='classpath'},
PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.3.0', encodedVersion=2.3.0, type=sink, typeName='sink', location='classpath'},
PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.3.0', encodedVersion=2.3.0, type=source, typeName='source', location='classpath'},
PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.3.0', encodedVersion=2.3.0, type=source, typeName='source', location='classpath'},
PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.3.0', encodedVersion=2.3.0, type=source, typeName='source', location='classpath'},
PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.3.0', encodedVersion=2.3.0, type=source, typeName='source', location='classpath'}"}

I understand that io.confluent.connect.jdbc.JdbcSinkConnector cannot be found, but how and where should I place the jar so that it is?

Thanks

2 answers:

Answer 0 (score: 0)

You have not provisioned the sink connector in Kafka Connect. See the docker-compose up --build -d command used to start the example: it builds a new Connect image with the JDBC sink connector baked in, as shown at https://github.com/debezium/debezium-examples/blob/master/unwrap-mongodb-smt/debezium-jdbc/Dockerfile#L10
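For reference, a minimal sketch of what that image build does (version numbers and the environment variable name are illustrative assumptions; the linked Dockerfile has the exact values): it starts from the Debezium Connect base image and downloads the Confluent JDBC sink connector into a directory on the plugin path.

```dockerfile
# Sketch of the debezium-jdbc image build (illustrative versions/paths)
FROM debezium/connect:1.0
ENV KAFKA_CONNECT_JDBC_DIR=$KAFKA_CONNECT_PLUGINS_DIR/kafka-connect-jdbc

# Fetch the Confluent JDBC sink connector so JdbcSinkConnector is discoverable
RUN mkdir -p $KAFKA_CONNECT_JDBC_DIR && cd $KAFKA_CONNECT_JDBC_DIR && \
    curl -sO https://packages.confluent.io/maven/io/confluent/kafka-connect-jdbc/5.3.1/kafka-connect-jdbc-5.3.1.jar
```

After rebuilding and restarting with this image, the error message's list of available connectors should include io.confluent.connect.jdbc.JdbcSinkConnector.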

Answer 1 (score: 0)

Download the JAR with the following command and save it in the plugins directory:

curl -sO https://packages.confluent.io/maven/io/confluent/kafka-connect-jdbc/10.0.0/kafka-connect-jdbc-10.0.0.jar
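Two caveats with this approach (based on a typical Debezium Connect container layout, which may differ from yours): the jar must land in a directory listed in the Connect worker's plugin.path, and the MySQL JDBC driver is not bundled with the sink connector, so it has to be placed alongside it. A deployment sketch, run inside the Connect container before restarting the worker (paths and versions are illustrative):

```shell
# Illustrative plugin directory; adjust to match your worker's plugin.path
PLUGIN_DIR=/kafka/connect/kafka-connect-jdbc
mkdir -p "$PLUGIN_DIR" && cd "$PLUGIN_DIR"

# JDBC sink connector
curl -sO https://packages.confluent.io/maven/io/confluent/kafka-connect-jdbc/10.0.0/kafka-connect-jdbc-10.0.0.jar

# MySQL JDBC driver (shipped separately from the connector)
curl -sO https://repo1.maven.org/maven2/mysql/mysql-connector-java/8.0.18/mysql-connector-java-8.0.18.jar
```

The worker only scans plugin.path at startup, so a restart is required before the connector appears in the /connector-plugins listing.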