How do I specify the "Date" data type in the schema information for the jdbc-sink-connector?

Asked: 2020-07-23 13:23:31

Tags: apache-kafka apache-kafka-connect

  • I am new to Kafka Connect.
  • I want to write data from a Kafka topic to Oracle using the JDBC sink connector.
  • I currently cannot use Avro / Schema Registry, so it has to be JSON with the schema information embedded in the data.
  • A "hello world" dataset with simple data types (int, string) was successfully written to an Oracle table created by the connector's auto-create feature (a sketch of the connector configuration follows this list).

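A minimal sketch of the kind of sink connector configuration this setup implies, written as a Kafka Connect REST API request body; the connection URL, credentials and topic name are placeholders, and io.confluent.connect.jdbc.JdbcSinkConnector is an assumption rather than a confirmed detail:

{
    "name": "oracle-sink-connector",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "connection.url": "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1",
        "connection.user": "connect_user",
        "connection.password": "connect_password",
        "topics": "foobar-topic",
        "auto.create": "true",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "value.converter.schemas.enable": "true"
    }
}
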
Now I am stuck, because I want to use a "Date" data type in my payload, but I do not know how to specify it in the schema information.

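For context, a record value with only simple data types, of the kind that was written successfully, looks roughly like this in the JSON-with-schema envelope (the field names and values here are made up):

{"schema":
    {"type": "struct",
     "name": "foobar",
     "optional": false,
     "fields":
        [{"type": "int32",  "optional": false, "field": "ID"},
         {"type": "string", "optional": false, "field": "NAME"}]},
 "payload":
    {"ID": 1, "NAME": "hello world"}}
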
Here is the error message (connect-distributed.log):

ERROR WorkerSinkTask{id=oracle-sink-connector-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask)
org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
        at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:178)
        at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.convertAndTransformRecord(WorkerSinkTask.java:488)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.convertMessages(WorkerSinkTask.java:465)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:321)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:224)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
        at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:177)
        at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:227)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.DataException: Unknown schema type: org.apache.kafka.connect.data.Date
        at org.apache.kafka.connect.json.JsonConverter.asConnectSchema(JsonConverter.java:528)
        at org.apache.kafka.connect.json.JsonConverter.asConnectSchema(JsonConverter.java:524)
        at org.apache.kafka.connect.json.JsonConverter.toConnectData(JsonConverter.java:371)
        at org.apache.kafka.connect.storage.Converter.toConnectData(Converter.java:86)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.lambda$convertAndTransformRecord$2(WorkerSinkTask.java:488)
        at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:128)
        at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162)
        ... 13 more

Here is the relevant part of my JSON:

{"records":
    [{"value":
        {"schema":
            {"type": "struct",
             "fields":
                [{"type": "org.apache.kafka.connect.data.Date", "optional": false, "field": "BEZDAT"},
                 ...],
             "optional": false,
             "name": "foobar"},
         "payload":
            {"BEZDAT": "2020-07-15",
             ...}}]}

I also tried Date, 'Date', and "Date" as the type.

According to the documentation, "Date" can be used. But how do I specify it? Am I missing something?
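
For reference, the JsonConverter named in the stack trace only accepts primitive names (boolean, int8, int16, int32, int64, float, double, string, bytes, array, map, struct) in the type attribute. Logical types such as Date are normally declared by keeping the primitive base type (int32 for Date) and moving the fully qualified logical type into the name attribute, with the payload carrying the number of days since the Unix epoch. A sketch of the record value under that assumption, showing only the date field (18458 days corresponds to 2020-07-15):

{"schema":
    {"type": "struct",
     "name": "foobar",
     "optional": false,
     "fields":
        [{"type": "int32",
          "name": "org.apache.kafka.connect.data.Date",
          "version": 1,
          "optional": false,
          "field": "BEZDAT"}]},
 "payload":
    {"BEZDAT": 18458}}

The same pattern would apply to the other logical types, e.g. org.apache.kafka.connect.data.Timestamp over an int64 holding milliseconds since the epoch.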

0 Answers:

There are no answers.