I am trying to connect Kafka with MySQL on Windows. I am not using Confluent. My Kafka version is 2.12. I have started ZooKeeper, Kafka, a producer, and a consumer, and everything works fine.
My MySQL version is 8.0.15.
I have copied these three jar files into the libs folder:
mysql-connector-java-8.0.15.jar
mysql-connector-java-5.1.47.jar
mysql-connector-java-5.1.47-bin.jar
My source-quickstart-mysql.properties file is:
name=test-source-mysql-jdbc-autoincrement
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:mysql://localhost:3306/databasename?user=rootname&password=password
mode=incrementing
incrementing.column.name=ID
topic.prefix=my-replicated-topic-table1
When I run the command
connect-standalone.bat ..\..\config\connect-standalone.properties ..\..\config\source-quickstart-mysql.properties
I get this error on the console:
[2019-03-26 16:16:39,524] ERROR Failed to create job for ..\..\config\source-quickstart-mysql.properties (org.apache.kafka.connect.cli.ConnectStandalone)
[2019-03-26 16:16:39,524] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSourceConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.1.0', encodedVersion=2.1.0, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.1.0', encodedVersion=2.1.0, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.1.0', encodedVersion=2.1.0, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'}
    at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
    at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:110)
Caused by: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSourceConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.1.0', encodedVersion=2.1.0, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.1.0', encodedVersion=2.1.0, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.1.0', encodedVersion=2.1.0, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'}
    at org.apache.kafka.connect.runtime.isolation.Plugins.newConnector(Plugins.java:179)
    at org.apache.kafka.connect.runtime.AbstractHerder.getConnector(AbstractHerder.java:382)
    at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:261)
    at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:189)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:107)
Please help me.
I also tried this post: https://supergloo.com/kafka-connect/kafka-connect-mysql-example/, but the command bin/confluent load jdbc-source -d jdbc-source.properties produced no output.
Answer 0 (score: 1)
Your error is
org.apache.kafka.connect.errors.ConnectException: Failed to find any class that
implements Connector and which name matches io.confluent.connect.jdbc.JdbcSourceConnector
Since you say you are not using the Confluent Platform, that makes sense: kafka-connect-jdbc is not part of Apache Kafka. You can either use the Confluent Platform, build the connector from source, or download it from http://hub.confluent.io.
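If it helps, here is a minimal sketch of the manual route, assuming you download the connector zip from hub.confluent.io yourself. The plugin directory and zip filename below are placeholders, not required locations:

```shell
# Sketch: install the JDBC connector without the Confluent Platform.
# PLUGIN_DIR and the zip filename are assumptions -- adjust to your setup.
PLUGIN_DIR="$HOME/kafka/plugins"
mkdir -p "$PLUGIN_DIR"
# Unzip the archive downloaded from hub.confluent.io into the plugin dir:
# unzip confluentinc-kafka-connect-jdbc-5.5.1.zip -d "$PLUGIN_DIR"
# Then point Kafka Connect at that directory in connect-standalone.properties:
echo "plugin.path=$PLUGIN_DIR"
```

After that, restart the Connect worker so it rescans the plugin path.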
Answer 1 (score: 0)
If you add kafka-connect-jdbc-5.5.1.jar to the Kafka lib path and restart ZooKeeper and the Kafka server, you should be able to connect.
Answer 2 (score: 0)
You can download the Kafka Connect JDBC connector from https://www.confluent.io/hub/confluentinc/kafka-connect-jdbc; it is free and can be used without the Confluent Platform. After unzipping it, set the plugin.path key in connect-standalone.properties (in the config directory of your Apache Kafka installation) to the location of confluentinc-kafka-connect-jdbc-5.5.1.jar. Run the script again and the error will be gone.
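For reference, the relevant line in connect-standalone.properties might look like this; the path is only an example, use wherever you unzipped the connector. Note that java.util.Properties treats backslashes as escape characters, so on Windows either escape them or use forward slashes:

```properties
# connect-standalone.properties -- example path, adjust to your installation
bootstrap.servers=localhost:9092
plugin.path=C:/kafka/plugins/confluentinc-kafka-connect-jdbc
```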