Kafka Connect - how to customize JSON using transforms

Date: 2019-04-11 14:21:10

Tags: apache-kafka apache-kafka-connect

This is the output I am trying to achieve:

{
    "source": "NEWS",
    "metadata": {
        "publishTime": "02/06/2019 09:56:24.317",
        "channel": "paper",
        "status": "active"
    },
    "Data": {
        "NAME": 67,
        "GENDER": "MALE",
        ...
    }
}

But this is what I actually get, which is what confuses me:

{
    "Data": {
        "NAME": 67,
        "GENDER": "MALE",
        ...
    },
    "source": "NEWS",
    "metadata": "{\"channel\":'paper'}"
}

And below is my connector.properties:

name=source-sqlserver-user
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:sqlserver://localhost:1433;database=testing;username=xyz;password=xyz;
table.whitelist=Tickets
mode=incrementing 
incrementing.column.name=Ticket_id
validate.non.null=false
topic.prefix=my-mssql-
transforms=MakeMap123,value,extract,InsertSourceDetails,xyz
transforms.MakeMap123.type=org.apache.kafka.connect.transforms.HoistField$Value
transforms.MakeMap123.field=Data
transforms.value.type=org.apache.kafka.connect.transforms.ValueToKey
transforms.value.fields=Ticket_id
transforms.extract.type=org.apache.kafka.connect.transforms.ExtractField$Value
transforms.extract.field=Ticket_id
#transforms.InsertTopic.type=org.apache.kafka.connect.transforms.InsertField$Value
#transforms.InsertTopic.topic.field=messagetopic
transforms.InsertSourceDetails.type=org.apache.kafka.connect.transforms.InsertField$Value
transforms.InsertSourceDetails.static.field=source
transforms.InsertSourceDetails.static.value=NEWS
transforms.xyz.type=org.apache.kafka.connect.transforms.InsertField$Value
transforms.xyz.static.field=metadata
transforms.xyz.static.value={"channel":'paper'}

Here source: NEWS is a static field and works as expected, but is it possible to take a few columns of the same table (all the columns belong to a single table) and nest them under another JSON key (metadata in my case)? Yes, I tried ValueToKey and ExtractField$Value, but ValueToKey throws an NPE.
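
If the built-in transforms cannot do this, I assume I would have to write a custom SMT along these lines. The sketch below is untested and only illustrates the idea: the class/package name (com.example.smt.NestMetadata) is made up, the column names channel, publishTime and status are hard-coded, and it assumes the record value is a Struct with a schema, as the JDBC source connector produces.

    package com.example.smt;  // hypothetical package name

    import java.util.Map;

    import org.apache.kafka.common.config.ConfigDef;
    import org.apache.kafka.connect.connector.ConnectRecord;
    import org.apache.kafka.connect.data.Field;
    import org.apache.kafka.connect.data.Schema;
    import org.apache.kafka.connect.data.SchemaBuilder;
    import org.apache.kafka.connect.data.Struct;
    import org.apache.kafka.connect.transforms.Transformation;

    /**
     * Sketch of a custom SMT that moves the channel/publishTime/status columns
     * of the flat JDBC record into a nested "metadata" struct.
     */
    public class NestMetadata<R extends ConnectRecord<R>> implements Transformation<R> {

        @Override
        public R apply(R record) {
            // Assumes the value is a schema-ful Struct (JDBC source with a schema-aware converter)
            Struct value = (Struct) record.value();

            Schema metadataSchema = SchemaBuilder.struct().name("metadata")
                    .field("channel", Schema.OPTIONAL_STRING_SCHEMA)
                    .field("publishTime", Schema.OPTIONAL_STRING_SCHEMA)
                    .field("status", Schema.OPTIONAL_STRING_SCHEMA)
                    .build();

            // New value schema: copy every field except the ones being nested,
            // then append the nested metadata struct.
            SchemaBuilder newSchemaBuilder = SchemaBuilder.struct().name(record.valueSchema().name());
            for (Field f : record.valueSchema().fields()) {
                if (!isMetadataField(f.name())) {
                    newSchemaBuilder.field(f.name(), f.schema());
                }
            }
            Schema newSchema = newSchemaBuilder.field("metadata", metadataSchema).build();

            Struct metadata = new Struct(metadataSchema)
                    .put("channel", value.get("channel"))
                    .put("publishTime", value.get("publishTime"))
                    .put("status", value.get("status"));

            Struct newValue = new Struct(newSchema);
            for (Field f : newSchema.fields()) {
                if (!"metadata".equals(f.name())) {
                    newValue.put(f.name(), value.get(f.name()));
                }
            }
            newValue.put("metadata", metadata);

            return record.newRecord(record.topic(), record.kafkaPartition(),
                    record.keySchema(), record.key(), newSchema, newValue, record.timestamp());
        }

        private boolean isMetadataField(String name) {
            return "channel".equals(name) || "publishTime".equals(name) || "status".equals(name);
        }

        @Override
        public ConfigDef config() {
            return new ConfigDef();  // field names are hard-coded in this sketch
        }

        @Override
        public void configure(Map<String, ?> configs) {
            // no configuration in this sketch
        }

        @Override
        public void close() {
        }
    }

If something like this were packaged as a jar on the worker's plugin.path, I believe it could be referenced from the properties file as transforms.nest.type=com.example.smt.NestMetadata instead of the InsertField-based xyz transform above, but I have not tried it.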

Is this achievable using Kafka Connect transforms, or am I missing something? Or should I use a custom Avro schema instead? Any example of a custom Avro schema would be appreciated.
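
For what it's worth, by a custom schema I mean something that describes the nested target layout, roughly like the following sketch built with Avro's SchemaBuilder. The record names Ticket, Metadata and Data and the namespace com.example are made up, the field types are guessed from the JSON example above, and whether such a schema actually helps depends on the converter setup, so treat it only as an illustration of the structure I want.

    import org.apache.avro.Schema;
    import org.apache.avro.SchemaBuilder;

    public class TicketSchemaExample {

        // Builds an Avro schema matching the desired nested JSON layout.
        public static Schema ticketSchema() {
            Schema metadata = SchemaBuilder.record("Metadata").fields()
                    .requiredString("publishTime")
                    .requiredString("channel")
                    .requiredString("status")
                    .endRecord();

            Schema data = SchemaBuilder.record("Data").fields()
                    .requiredInt("NAME")       // 67 in the example, so treated as an int here
                    .requiredString("GENDER")
                    .endRecord();

            return SchemaBuilder.record("Ticket").namespace("com.example")
                    .fields()
                    .requiredString("source")
                    .name("metadata").type(metadata).noDefault()
                    .name("Data").type(data).noDefault()
                    .endRecord();
        }
    }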

0 Answers:

No answers yet.