I am trying to push messages from a Kafka topic (the data is in Avro format) into a Postgres table. I have full privileges to create/insert/update/delete databases and tables. The first time I run the sink connector it automatically creates a table and loads all the data, but when I stop the connector and then try to load new data into the existing table, it fails with an error like this:
Caused by: java.sql.SQLException: java.sql.BatchUpdateException: Batch entry 0 INSERT INTO "testing" ("EMPID","TS","EMPNAME","EMPSALARY") VALUES ('abc123','2019:01:23','john',10) ON CONFLICT ("EMPID") DO UPDATE SET "TS"=EXCLUDED."TS","EMPNAME"=EXCLUDED."EMPNAME","EMPSALARY"=EXCLUDED."HITS" was aborted. Call getNextException to see the cause.
org.postgresql.util.PSQLException: ERROR: relation "testing" does not exist
Position: 13
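In case it helps with diagnosis: since the error says the relation does not exist even though the first run created it, here is the kind of check I can run in psql to see which schema the auto-created table actually landed in and what the session's search_path is (assuming I connect to the same temp database as the connector):

SELECT table_schema, table_name
FROM information_schema.tables
WHERE table_name = 'testing';

SHOW search_path;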
Here is my sink connector configuration:
"name": "test",
"config": {
"connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
"tasks.max": "1",
"topics": "emp_data",
"key.converter": "io.confluent.connect.avro.AvroConverter",
"key.converter.schema.registry.url": "http://localhost:8081",
"value.converter": "io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url": "http://localhost:8081",
"connection.url": "jdbc:postgresql://localhost:5432/temp",
"connection.user": "root",
"connection.password": "pwd",
"compact.map.entries": "false",
"insert.mode": "upsert",
"batch.size": "1",
"table.name.format": "testing",
"pk.mode":"record_value",
"pk.fields":"EmpID"
"fields.whitelist": "timestamp,empid,empname, empsalary",
"key.ignore": "true",
"auto.create": "false",
"auto.evolve": "true",
"type.connect": "kafka-connect"
I also tried creating the table myself in the database first and then pushing data into it, but nothing happens.
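For context, I assume the target table has to look something like this (the column names are taken from the generated INSERT in the error above; the quoted, upper-case identifiers and the column types are just my guesses):

CREATE TABLE "testing" (
    "EMPID"     text PRIMARY KEY,   -- matches pk.mode=record_value on the employee id
    "TS"        text,               -- guessed type; the value in the INSERT is the string '2019:01:23'
    "EMPNAME"   text,
    "EMPSALARY" integer
);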
Any help would be greatly appreciated!