How to fix Kafka Connect JSONConverter "Schema must contain 'type' field"

Asked: 2019-12-30 17:26:57

Tags: jdbc apache-kafka apache-kafka-connect confluent

I am trying to send a message to a JdbcSink, and the message looks like this:

{
  "schema": {
    "type": "struct",
    "fields": [{
      "field": "ID",
      "type": {
        "type": "bytes",
        "scale": 0,
        "precision": 64,
        "connect.version": 1,
        "connect.parameters": {
          "scale": "0"
        },
        "connect.name": "org.apache.kafka.connect.data.Decimal",
        "logicalType": "decimal"
      }
    }, {
      "field": "STORE_DATE",
      "type": ["null", {
        "type": "long",
        "connect.version": 1,
        "connect.name": "org.apache.kafka.connect.data.Timestamp",
        "logicalType": "timestamp-millis"
      }],
      "default": null
    }, {
      "field": "DATA",
      "type": ["null", "string"],
      "default": null
    }],
    "name": "KAFKA_STREAM"
  },
  "payload": {
    "ID": 17,
    "STORE_DATE": null,
    "DATA": "THIS IS TEST DATA"
  }
}

But it always throws the error: Caused by: org.apache.kafka.connect.errors.DataException: Schema must contain 'type' field

Here is the connector configuration currently in use:

{
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "DEV_KAFKA_STREAM",
    "connection.url": "url",
    "connection.user": "user",
    "connection.password": "password",
    "insert.mode": "insert",
    "table.name.format": "KAFKA_STREAM",
    "pk.fields": "ID",
    "auto.create": "false",
    "errors.log.enable": "true",
    "errors.log.include.messages": "true",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "true"
}

Since the JSON does have a type field, I'm not sure how to debug this or how to find the root cause.

1 answer:

Answer 0: (score: 0)

As far as I can tell, a nested object such as "type": { "type": "bytes", ... } is not a valid schema type for the JsonConverter.

You want a plain type string, e.g. "type": "bytes", with the Connect metadata (name, version, parameters) as sibling keys of the field entry.

See the JSON Schema source code.

You might also want to remove the unions. There is an "optional" key for specifying nullable fields.
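Putting those two changes together, here is a sketch of what the corrected message could look like. It is a reconstruction based on the envelope format JsonConverter emits, not something taken from the original answer, so verify it against the converter's own output; in particular, a Decimal field is carried as base64-encoded unscaled bytes ("EQ==" encodes 17 at scale 0).

{
  "schema": {
    "type": "struct",
    "name": "KAFKA_STREAM",
    "fields": [{
      "field": "ID",
      "type": "bytes",
      "optional": false,
      "name": "org.apache.kafka.connect.data.Decimal",
      "version": 1,
      "parameters": { "scale": "0" }
    }, {
      "field": "STORE_DATE",
      "type": "int64",
      "optional": true,
      "name": "org.apache.kafka.connect.data.Timestamp",
      "version": 1
    }, {
      "field": "DATA",
      "type": "string",
      "optional": true
    }]
  },
  "payload": {
    "ID": "EQ==",
    "STORE_DATE": null,
    "DATA": "THIS IS TEST DATA"
  }
}

The converter only accepts a plain string for every "type" key, so both the nested object under ID and the ["null", ...] unions trip the "Schema must contain 'type' field" check even though a type key is technically present.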

Kafka Connect JDBC sink connector not working

If you are creating this JSON in Java, you should use the SchemaBuilder and Envelope classes around two JsonNode objects to ensure the payload is built correctly.
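As one way to do that, here is a minimal sketch assuming the Kafka Connect API is on the classpath; rather than touching the internal Envelope class directly, it lets JsonConverter build the envelope via fromConnectData (the class name and overall structure are illustrative, not from the original answer):

import org.apache.kafka.connect.data.Decimal;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.data.Timestamp;
import org.apache.kafka.connect.json.JsonConverter;

import java.math.BigDecimal;
import java.util.Collections;

public class KafkaStreamMessageBuilder {

    public static void main(String[] args) {
        // Build the Connect schema once; Decimal and Timestamp are the logical
        // types the JDBC sink expects for NUMBER and TIMESTAMP columns.
        Schema schema = SchemaBuilder.struct()
                .name("KAFKA_STREAM")
                .field("ID", Decimal.schema(0))
                .field("STORE_DATE", Timestamp.builder().optional().build())
                .field("DATA", Schema.OPTIONAL_STRING_SCHEMA)
                .build();

        // Populate a Struct instead of hand-writing the "payload" section.
        Struct value = new Struct(schema)
                .put("ID", new BigDecimal(17))
                .put("STORE_DATE", null)
                .put("DATA", "THIS IS TEST DATA");

        // JsonConverter wraps schema + payload in the envelope the sink parses
        // when value.converter.schemas.enable=true.
        JsonConverter converter = new JsonConverter();
        converter.configure(Collections.singletonMap("schemas.enable", "true"), false);
        byte[] message = converter.fromConnectData("DEV_KAFKA_STREAM", schema, value);

        System.out.println(new String(message));
    }
}

Producing the byte[] returned by fromConnectData to DEV_KAFKA_STREAM gives the sink a message whose schema section already matches what the JsonConverter on the consuming side expects to parse.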