Kafka Connect Docker image cannot find the JDBC connector plugin

Asked: 2018-03-26 15:20:33

Tags: jdbc apache-kafka apache-kafka-connect

New to Kafka Connect, and I can't get a basic JDBC source working. I'm using the following docker-compose.yml snippet to create the image:

kafka-connect_fronting:
  image: confluentinc/cp-kafka-connect
  container_name: kafka-connect_fronting
  hostname: connect_fronting
  depends_on:
    - zookeeper_fronting
    - kafka_fronting
    - schema-registry_fronting
  ports:
    - "8083:8083"
  volumes:
    - ./jars:/etc/kafka-connect/jars/
  environment:
    CONNECT_BOOTSTRAP_SERVERS: 'kafka_fronting:29092'
    CONNECT_REST_ADVERTISED_HOST_NAME: connect
    CONNECT_REST_PORT: 8083
    CONNECT_GROUP_ID: compose-connect-group
    CONNECT_CONFIG_STORAGE_TOPIC: docker-connect-configs
    CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
    CONNECT_OFFSET_FLUSH_INTERVAL_MS: 10000
    CONNECT_OFFSET_STORAGE_TOPIC: docker-connect-offsets
    CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
    CONNECT_STATUS_STORAGE_TOPIC: docker-connect-status
    CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
    CONNECT_KEY_CONVERTER: io.confluent.connect.avro.AvroConverter
    CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: 'http://schema-registry_fronting:8081'
    CONNECT_VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
    CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: 'http://schema-registry_fronting:8081'
    CONNECT_INTERNAL_KEY_CONVERTER: org.apache.kafka.connect.json.JsonConverter
    CONNECT_INTERNAL_VALUE_CONVERTER: org.apache.kafka.connect.json.JsonConverter
    CONNECT_ZOOKEEPER_CONNECT: 'zookeeper_fronting:32181'
    CONNECT_PLUGIN_PATH: '/etc/kafka-connect/jars'

The image appears to start up normally:

curl http://localhost:8083/

{"version":"1.0.0-cp1","commit":"ec61c5e93da662df"}

and I confirmed that the mysql jar is present in the image:

root@connect_fronting:~# ls -la /etc/kafka-connect/jars
total 980
drwxr-xr-x 4 root root    128 Mar 26 13:20 .
drwxrwxrwx 1 root root   4096 Mar 26 13:56 ..
-rw-r--r-- 1 root root 989497 May  4  2016 mysql-connector-java-5.1.39-bin.jar

But I run into trouble when I try to create the connector:

curl -X POST \
  -H "Content-Type: application/json" \
  --data '{ "name": "quickstart-jdbc-source", "config": { "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector", "tasks.max": 1, "connection.url": "jdbc:mysql://127.0.0.1:3306/maintenance?user=root&password=superset", "mode": "incrementing", "incrementing.column.name": "id", "timestamp.column.name": "modified", "topic.prefix": "quickstart-jdbc-", "poll.interval.ms": 1000 } }' \
  http://kafka-connect_fronting:8083/connectors

{"error_code":500,"message":"Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSourceConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', 
version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=source, typeName='source', location='classpath'}"}

When I hit the API, I only see the file source/sink plugins:

$curl http://localhost:8083/connector-plugins/
[{"class":"org.apache.kafka.connect.file.FileStreamSinkConnector","type":"sink","version":"1.0.0-cp1"},{"class":"org.apache.kafka.connect.file.FileStreamSourceConnector","type":"source","version":"1.0.0-cp1"}]

The documentation states that JDBC is included. I must be missing something basic here, and I'd appreciate any help.

1 Answer:

Answer 0 (score: 2)

Your CONNECT_PLUGIN_PATH also needs to include the location of the Connect plugins that ship with the Confluent Platform, in addition to the location where you store your custom plugins (which I assume is /etc/kafka-connect/jars in your example). This property is a list, so it can take multiple paths.
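As a minimal illustration of those list semantics (the two paths are taken from this question, not from any particular image), the worker splits plugin.path on commas and scans each directory independently for plugins at startup:

```shell
#!/usr/bin/env bash
# Illustrative sketch only: plugin.path is a comma-separated list, and the
# Connect worker scans every directory on it. A jar that sits outside all
# listed directories is never discovered.
PLUGIN_PATH='/usr/share/java,/etc/kafka-connect/jars'

IFS=',' read -ra DIRS <<< "$PLUGIN_PATH"
for dir in "${DIRS[@]}"; do
  echo "scanning: $dir"
done
```

With the original single-path value, only /etc/kafka-connect/jars would be scanned, which is why just the classpath-built-in file connectors showed up.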

My guess is that:

CONNECT_PLUGIN_PATH: '/usr/share/java,/etc/kafka-connect/jars'

will work. But you'll have to double-check that the specific Docker image you are using really does store Confluent's Connect plugins under /usr/share/java.
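Applied to the compose file in the question, the change is one line in the environment block (a sketch; as noted above, /usr/share/java is an assumption you should verify for your image):

```yaml
environment:
  # Scan both the Confluent-bundled plugin location and the mounted
  # custom-jars volume. /usr/share/java is an assumption -- confirm it
  # inside the container you are actually running.
  CONNECT_PLUGIN_PATH: '/usr/share/java,/etc/kafka-connect/jars'
```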

More details are in the Docker quickstart here: https://docs.confluent.io/current/installation/docker/docs/quickstart.html#kafka-connect