I have a Kafka topic containing messages with an Avro-serialized key and an Avro-serialized value.
I am trying to set up a sink connector to land these values in a table in a Postgres database (AWS RDS in this case).
I've tried a number of variations on the topic, the messages, and the sink configuration itself, but here is an example — if anyone can offer guidance on where I'm going wrong, that would be great! :)
My topic has the following schemas (in the Schema Registry)...
Key schema:
{
    "type": "record",
    "name": "TestTopicKey",
    "namespace": "test.messaging.avro",
    "doc": "Test key schema.",
    "fields": [
        { "name": "unitId", "type": "int" }
    ]
}
Value schema:
{
    "type": "record",
    "name": "TestTopicValues",
    "namespace": "test.messaging.avro",
    "doc": "Test value schema.",
    "fields": [
        { "name": "unitPrice", "type": "int", "doc": "Price in AUD excluding GST." },
        { "name": "unitDescription", "type": "string" }
    ]
}
I am manually producing records to the topic using kafka-avro-console-producer, as follows:
/bin/kafka-avro-console-producer --broker-list kafka-box-one:9092 --topic test.units --property parse.key=true --property "key.separator=|" --property "schema.registry.url=http://kafka-box-one:8081" --property key.schema='{"type":"record","name":"TestTopicKey","namespace":"test.messaging.avro","doc":"Test key schema.","fields":[{"name":"unitId","type":"int"}]}' --property value.schema='{"type":"record","name":"TestTopicValues","namespace":"test.messaging.avro","doc":"Test value schema.","fields":[{"name":"unitPrice","type":"int","doc":"Price in AUD excluding GST."},{"name":"unitDescription","type":"string"}]}'
With the producer running, I can then successfully add records to the topic, like so:
{"unitId":111}|{"unitPrice":15600,"unitDescription":"A large widget thingy."}
NB: I can also successfully consume these records with kafka-avro-console-consumer, as expected.
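For what it's worth, the `parse.key=true` / `key.separator=|` flags above make the console producer split each input line at the separator and treat the two halves as the key and value JSON. A toy sketch of that split (the function name is mine, purely for illustration):

```python
import json

def parse_record(line: str, separator: str = "|"):
    """Split a console-producer style input line into (key, value) JSON objects."""
    key_part, value_part = line.split(separator, 1)
    return json.loads(key_part), json.loads(value_part)

key, value = parse_record(
    '{"unitId":111}|{"unitPrice":15600,"unitDescription":"A large widget thingy."}'
)
print(key["unitId"], value["unitPrice"])  # 111 15600
```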
The Postgres table I want to insert into looks like this:
CREATE TABLE test_area.unit_prices (
    unitId int4 NOT NULL,
    unitPrice int4 NULL,
    unitDescription text NULL,
    CONSTRAINT unit_prices_unitid_pk PRIMARY KEY (unitId)
);
</CREATE TABLE test_area.unit_prices>
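One detail worth keeping in mind with DDL like the above: Postgres folds unquoted identifiers to lowercase, so these columns are actually stored as `unitid`, `unitprice`, and `unitdescription`, while the Avro field names stay camelCase. Whether that mismatch matters depends on how the sink quotes identifiers, but the folding rule itself can be sketched as:

```python
def fold_identifier(name: str) -> str:
    """Model Postgres identifier folding: unquoted names are lowercased,
    double-quoted names keep their exact case."""
    if len(name) >= 2 and name.startswith('"') and name.endswith('"'):
        return name[1:-1]   # quoted: case preserved
    return name.lower()     # unquoted: folded to lowercase

print(fold_identifier("unitId"))    # unitid
print(fold_identifier('"unitId"'))  # unitId
```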
My sink connector looks like this:
{
    "name": "test.area.unit.prices.v01",
    "config": {
        "connector.class": "JdbcSinkConnector",
        "topics": "test.units",
        "group.id": "test.area.unit.prices.v01",
        "key.converter": "io.confluent.connect.avro.AvroConverter",
        "key.converter.schema.registry.url": "http://kafka-box-one:8081",
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter.schema.registry.url": "http://kafka-box-one:8081",
        "connection.user": "KafkaSinkUser",
        "connection.password": "KafkaSinkPassword",
        "connection.url": "jdbc:postgresql://unit-catalogue.abcdefghij.my-region-1.rds.amazonaws.com:5432/unit_sales?currentSchema=test_area",
        "table.name.format": "unit_prices",
        "auto.create": false,
        "auto.evolve": "false"
    }
}
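For reference, a connector definition like this is normally registered by POSTing it to the Connect worker's REST API (the worker host and port 8083 below are assumptions about the environment, not something stated in the question). A minimal sketch that builds and sanity-checks the payload without touching the network:

```python
import json

# Mirrors the config above; host names and credentials are placeholders.
connector = {
    "name": "test.area.unit.prices.v01",
    "config": {
        "connector.class": "JdbcSinkConnector",
        "topics": "test.units",
        "key.converter": "io.confluent.connect.avro.AvroConverter",
        "key.converter.schema.registry.url": "http://kafka-box-one:8081",
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter.schema.registry.url": "http://kafka-box-one:8081",
        "connection.user": "KafkaSinkUser",
        "connection.password": "KafkaSinkPassword",
        "connection.url": "jdbc:postgresql://unit-catalogue.abcdefghij.my-region-1.rds.amazonaws.com:5432/unit_sales?currentSchema=test_area",
        "table.name.format": "unit_prices",
        "auto.create": "false",
        "auto.evolve": "false",  # note the property is spelled "auto.evolve"
    },
}

payload = json.dumps(connector)

# To register it (assuming the Connect REST API listens on port 8083):
#   curl -X POST -H "Content-Type: application/json" \
#        -d "$PAYLOAD" http://kafka-box-one:8083/connectors
print(json.loads(payload)["name"])  # test.area.unit.prices.v01
```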
My expectation was that records would appear in the Postgres table shortly after the sink showed as running. However, nothing is landing.
Additional notes:
Answer (score: 1)
Apologies for the slow response on this one. After finally getting logging working, it turned out to be a proxy issue. Thanks everyone for your help.