Kafka JDBC sink connector standalone error

Date: 2019-01-29 09:40:48

Tags: jdbc apache-kafka apache-kafka-connect confluent

I am trying to insert data from a topic in Kafka into a PostgreSQL database. I am loading the connector with the following command:

./bin/connect-standalone etc/schema-registry/connect-avro-standalone.properties etc/kafka-connect-jdbc/sink-quickstart-mysql.properties

sink-quickstart-mysql.properties is as follows:

name=test-sink-mysql-jdbc-autoincrement
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=third_topic
connection.url=jdbc:postgres://localhost:5432/postgres
connection.user=postgres
connection.password=postgres
auto.create=true

The error I get is:

[2019-01-29 13:16:48,859] ERROR Failed to create job for /home/ashley/confluent-5.1.0/etc/kafka-connect-jdbc/sink-quickstart-mysql.properties (org.apache.kafka.connect.cli.ConnectStandalone:102)
[2019-01-29 13:16:48,862] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:113)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}
	at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
	at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
	at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:110)
Caused by: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}
	at org.apache.kafka.connect.runtime.isolation.Plugins.newConnector(Plugins.java:179)
	at org.apache.kafka.connect.runtime.AbstractHerder.getConnector(AbstractHerder.java:382)
	at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:261)
	at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:189)
	at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:107)
[2019-01-29 13:16:48,886] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:65)
[2019-01-29 13:16:48,886] INFO Stopping REST server (org.apache.kafka.connect.runtime.rest.RestServer:223)
[2019-01-29 13:16:48,894] INFO Stopped http_8083@dc4fee1{HTTP/1.1,[http/1.1]}{0.0.0.0:8083} (org.eclipse.jetty.server.AbstractConnector:341)
[2019-01-29 13:16:48,895] INFO node0 Stopped scavenging (org.eclipse.jetty.server.session:167)
[2019-01-29 13:16:48,930] INFO Stopped o.e.j.s.ServletContextHandler@3c46dcbe{/,null,UNAVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:1040)
[2019-01-29 13:16:48,943] INFO REST server stopped (org.apache.kafka.connect.runtime.rest.RestServer:241)
[2019-01-29 13:16:48,943] INFO Herder stopping (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:95)
[2019-01-29 13:16:48,944] INFO Worker stopping (org.apache.kafka.connect.runtime.Worker:184)
[2019-01-29 13:16:48,944] INFO Stopped FileOffsetBackingStore (org.apache.kafka.connect.storage.FileOffsetBackingStore:66)
[2019-01-29 13:16:48,947] INFO Worker stopped (org.apache.kafka.connect.runtime.Worker:205)
[2019-01-29 13:16:48,950] INFO Herder stopped (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:112)
[2019-01-29 13:16:48,951] INFO Kafka Connect stopped (org.apache.kafka.connect.runtime.Connect:70)

The postgres jar file already exists in that folder. Can anyone advise?

Thanks, Ashley

1 Answer:

Answer 0 (score: 1):

This is the most important line in your log:

    java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector, available connectors are: ...

It seems you don't have the kafka-connect-jdbc connector installed.

Check the plugin.path property in etc/schema-registry/connect-avro-standalone.properties, and make sure the plugin.path line is not commented out.
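As a sketch, the relevant line might look like this (the path below is an example based on a Confluent 5.1.0 tarball install, not necessarily your layout; point it at whatever directory holds your connector plugins):

```properties
# Must be uncommented; can be a comma-separated list of directories
plugin.path=/home/ashley/confluent-5.1.0/share/java
```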

If you are not using the Confluent Platform, you need to create a separate directory for the JDBC plugin under that plugin.path directory, e.g. kafka-connect-jdbc, and place all the required jars there: kafka-connect-jdbc-5.1.0.jar, its dependencies, and your JDBC driver.
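A minimal sketch of that layout, assuming a made-up plugin.path of /tmp/kafka/plugins and example jar names (substitute your real plugin.path, connector jars, and driver version):

```shell
# Example only: replace /tmp/kafka/plugins with the directory your
# worker's plugin.path property actually points at.
PLUGIN_PATH=/tmp/kafka/plugins

# One subdirectory per plugin; the worker scans one level below
# plugin.path, so the jars must live inside this subdirectory,
# not directly in $PLUGIN_PATH.
mkdir -p "$PLUGIN_PATH/kafka-connect-jdbc"

# Copy the connector jar, its dependencies, and the JDBC driver there,
# e.g. (jar names are illustrative):
# cp kafka-connect-jdbc-5.1.0.jar postgresql-42.2.5.jar "$PLUGIN_PATH/kafka-connect-jdbc/"

ls "$PLUGIN_PATH"
```

After restarting the worker, `curl http://localhost:8083/connector-plugins` against the Connect REST API should list io.confluent.connect.jdbc.JdbcSinkConnector among the available plugins.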

More details can be found here: https://docs.confluent.io/current/connect/userguide.html#installing-plugins
