Kafka Connect JDBC Source Connector not working with Microsoft SQL Server

Asked: 2018-03-22 15:58:28

Tags: sql-server jdbc apache-kafka apache-kafka-connect

I have set up a dockerized Kafka Connect cluster running in distributed mode. I am trying to set up the Kafka JDBC Source Connector to move data between Microsoft SQL Server and Kafka.

Here is the response from my connector-plugins API:
[
    {
        "class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
        "type": "sink",
        "version": "4.0.0"
    },
    {
        "class": "io.confluent.connect.hdfs.HdfsSinkConnector",
        "type": "sink",
        "version": "4.0.0"
    },
    {
        "class": "io.confluent.connect.hdfs.tools.SchemaSourceConnector",
        "type": "source",
        "version": "1.0.0-cp1"
    },
    {
        "class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "type": "sink",
        "version": "4.0.0"
    },
    {
        "class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "type": "source",
        "version": "4.0.0"
    },
    {
        "class": "io.debezium.connector.mongodb.MongoDbConnector",
        "type": "source",
        "version": "0.7.4"
    },
    {
        "class": "io.debezium.connector.mysql.MySqlConnector",
        "type": "source",
        "version": "0.7.4"
    },
    {
        "class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
        "type": "sink",
        "version": "1.0.0-cp1"
    },
    {
        "class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
        "type": "source",
        "version": "1.0.0-cp1"
    }
]

I have added the JDBC driver provided by Microsoft SQL Server to the plugins path of my Kafka Connect cluster.

Here is my input to the connectors API:

curl -X POST \
  http://kafka-connect-cluster.com/connectors \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -d '{
"name": "mssql-source-connector",
"config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "mode": "timestamp",
        "timestamp.column.name": "updateTimeStamp",
        "query": "select * from table_name",
        "tasks.max": "1",
        "table.types": "TABLE",
        "key.converter.schemas.enable": "false",
        "topic.prefix": "data_",
        "value.converter.schemas.enable": "false",
        "connection.url": "jdbc:sqlserver://<host>:<port>;databaseName=<dbName>;",
        "connection.user": "<username>",
        "connection.password": "<password>",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "key.converter": "org.apache.kafka.connect.json.JsonConverter",
        "poll.interval.ms": "5000",
        "table.poll.interval.ms": "120000"
    }
}'

The error I get when trying this request is as follows:

{
    "error_code": 400,
    "message": "Connector configuration is invalid and contains the following 2 error(s):\nInvalid value java.sql.SQLException: No suitable driver found for jdbc:sqlserver://<host>:<port>;databaseName=<db_name>; for configuration Couldn't open connection to jdbc:sqlserver://<host>:<port>;databaseName=<db_name>;\nInvalid value java.sql.SQLException: No suitable driver found for jdbc:sqlserver://<host>:<port>;databaseName=<db_name;> for configuration Couldn't open connection to jdbc:sqlserver://<host>:<port>;databaseName=<db_name;>\nYou can also find the above list of errors at the endpoint `/{connectorType}/config/validate`"
}

Any help would be greatly appreciated.

Thanks

2 answers:

Answer 0 (score: 5)

Credit to @rmoff's answer for pointing me in the right direction.

So the problem lay in two places.

  1. This is more of an FYI than a problem. I had supplied a custom CONNECT_PLUGIN_PATH to the Docker image. There is nothing wrong with doing this, but it is generally not a good idea, because you then have to copy over all the plugins available in the base Confluent Platform image, and that can create problems when moving to a new version, since you may have to go through the same process all over again.
  2. This part is the most important one. The SQL Server JDBC driver needs to be in the same folder as kafka-connect-jdbc-<confluent-version>.jar, which in my case was kafka-connect-jdbc-4.0.0.jar.
  3. Once those two points were addressed, my SQL Server JDBC driver started working as expected.
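Point 2 above can be sketched as a shell session. The directory, jar names, and driver version below are illustrative assumptions only; adjust them to your own installation (in a Confluent Platform image the connector jar typically lives under `share/java/kafka-connect-jdbc/`):

```shell
# Illustrative layout only: paths and file names here are assumptions.
# The key point: the SQL Server driver jar must sit in the SAME directory
# as kafka-connect-jdbc-4.0.0.jar, not merely somewhere on the plugin path.
CONNECT_JDBC_DIR="./share/java/kafka-connect-jdbc"   # e.g. /usr/share/java/kafka-connect-jdbc
DRIVER_JAR="mssql-jdbc-6.2.2.jre8.jar"               # hypothetical driver file name

mkdir -p "$CONNECT_JDBC_DIR"
touch "$CONNECT_JDBC_DIR/kafka-connect-jdbc-4.0.0.jar"  # stand-in for the connector jar
touch "$DRIVER_JAR"                                     # stand-in for the downloaded driver

# Place the driver next to the connector jar, then restart the Connect worker.
cp "$DRIVER_JAR" "$CONNECT_JDBC_DIR/"
ls "$CONNECT_JDBC_DIR"
```

After copying the jar, the Connect worker must be restarted for the driver to be picked up.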

Answer 1 (score: 1)

According to https://docs.microsoft.com/en-us/sql/connect/jdbc/building-the-connection-url, the trailing `;` in your URL is invalid. Also try placing the JDBC driver in `share/java/kafka-connect-jdbc`, and/or adding it to the `CLASSPATH` environment variable.
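For illustration, the `connection.url` from the question without the trailing semicolon would look like this (host, port, and database name remain placeholders):

```json
"connection.url": "jdbc:sqlserver://<host>:<port>;databaseName=<dbName>"
```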