Kafka producer cannot validate records without a PK and returns InvalidRecordException

Asked: 2020-04-14 02:11:27

Tags: postgresql apache-kafka apache-kafka-connect debezium

My Kafka producer is throwing an error. I am using the Debezium Kafka connector v1.1.0.Final with Kafka 2.4.1. For tables with a PK everything goes through fine, but unfortunately for tables without a PK it gives me this error:

[2020-04-14 10:00:00,096] INFO   Exporting data from table 'public.table_0' (io.debezium.relational.RelationalSnapshotChangeEventSource:280)
[2020-04-14 10:00:00,097] INFO   For table 'public.table_0' using select statement: 'SELECT * FROM "public"."table_0"' (io.debezium.relational.RelationalSnapshotChangeEventSource:287)
[2020-04-14 10:00:00,519] INFO   Finished exporting 296 records for table 'public.table_0'; total duration '00:00:00.421' (io.debezium.relational.RelationalSnapshotChangeEventSource:330)
[2020-04-14 10:00:00,522] INFO Snapshot - Final stage (io.debezium.pipeline.source.AbstractSnapshotChangeEventSource:79)
[2020-04-14 10:00:00,523] INFO Snapshot ended with SnapshotResult [status=COMPLETED, offset=PostgresOffsetContext [sourceInfo=source_info[server='postgres'db='xxx, lsn=38/C74913C0, txId=4511542, timestamp=2020-04-14T02:00:00.517Z, snapshot=FALSE, schema=public, table=table_0], partition={server=postgres}, lastSnapshotRecord=true]] (io.debezium.pipeline.ChangeEventSourceCoordinator:90)
[2020-04-14 10:00:00,524] INFO Connected metrics set to 'true' (io.debezium.pipeline.metrics.StreamingChangeEventSourceMetrics:59)
[2020-04-14 10:00:00,526] INFO Starting streaming (io.debezium.pipeline.ChangeEventSourceCoordinator:100)
[2020-04-14 10:00:00,550] ERROR WorkerSourceTask{id=pg_dev_pinjammodal-0} failed to send record to table_0: (org.apache.kafka.connect.runtime.WorkerSourceTask:347)
org.apache.kafka.common.InvalidRecordException: This record has failed the validation on broker and hence be rejected.

I checked the table and the records look valid. I have set producer.ack=1 in the configuration. Could that setting be what triggers the invalid record here? A standalone reproduction attempt is sketched below.
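To narrow this down outside of Kafka Connect, one can send a record with a null key (which is what snapshot rows from a table without a PK produce) directly to the same topic with a bare producer and the same acks setting. This is only a minimal sketch: the broker address localhost:9092, the topic name postgres.public.table_0, and the JSON payload are placeholder assumptions, not values from the question.

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class NullKeyRepro {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder broker address
        props.put("acks", "1");                              // same acks value as in the question
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // A record with a null key, like a snapshot row from a table without a PK.
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("postgres.public.table_0", null, "{\"test\": 1}");
            // .get() surfaces any broker-side rejection as an ExecutionException.
            producer.send(record).get();
            System.out.println("Record accepted by the broker");
        }
    }
}

If this fails with the same InvalidRecordException while a keyed record succeeds, the problem lies with the missing key and the topic settings rather than with acks.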

1 Answer:

Answer 0 (score: 6)

I checked the logs again, and my mistake was that I had created the Kafka topics for these key-less tables with log compaction enabled. Because the tables have no PK, the Debezium messages carry no key, and the broker cannot accept key-less messages on a compacted topic, so validation fails. So if your table has no PK and you want to push it to Kafka, do not enable log compaction on that topic.
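One way to confirm and fix this is to inspect the topic's cleanup.policy and switch it back to delete-based retention. Below is a minimal sketch using Kafka's AdminClient (available in Kafka 2.4); the broker address localhost:9092 and the topic name postgres.public.table_0 are assumptions based on the log above, so adjust them to your setup.

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AlterConfigOp;
import org.apache.kafka.clients.admin.Config;
import org.apache.kafka.clients.admin.ConfigEntry;
import org.apache.kafka.common.config.ConfigResource;
import org.apache.kafka.common.config.TopicConfig;

import java.util.Collections;
import java.util.Map;
import java.util.Properties;

public class FixTopicCompaction {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder broker address

        try (AdminClient admin = AdminClient.create(props)) {
            ConfigResource topic =
                    new ConfigResource(ConfigResource.Type.TOPIC, "postgres.public.table_0");

            // 1. Show the current cleanup.policy; "compact" means key-less records are rejected.
            Map<ConfigResource, Config> configs =
                    admin.describeConfigs(Collections.singleton(topic)).all().get();
            ConfigEntry policy = configs.get(topic).get(TopicConfig.CLEANUP_POLICY_CONFIG);
            System.out.println("cleanup.policy = " + policy.value());

            // 2. Switch the topic to time/size-based retention so null-key records are accepted.
            AlterConfigOp setDelete = new AlterConfigOp(
                    new ConfigEntry(TopicConfig.CLEANUP_POLICY_CONFIG,
                                    TopicConfig.CLEANUP_POLICY_DELETE),
                    AlterConfigOp.OpType.SET);
            admin.incrementalAlterConfigs(
                    Collections.singletonMap(topic, Collections.singletonList(setDelete)))
                 .all().get();
        }
    }
}

The same change can of course be made with the kafka-configs command-line tool or by recreating the topic without cleanup.policy=compact; the key point is that compaction requires every record to have a key.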