I'm getting this error while attempting to stream an RDS MySQL table to Redshift: Error converting data, invalid type for parameter
The problem field is a DATETIME in MySQL and a timestamp without time zone in Redshift (the same thing happens with timestamp with time zone). Note: the pipeline was working fine until I started populating the date field.
We're using Debezium as the Kafka Connect source to get data from RDS into Kafka, and the JDBC sink connector with the Redshift JDBC driver for the sink.
Also: if I change the Redshift field to varchar or bigint, I can get the data flowing. When I do that, the values arrive as unix epoch integers in milliseconds. But we really want a timestamp!
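For what it's worth, the epoch-milliseconds values make me think Debezium is emitting the DATETIME as an INT64 logical type that the sink then binds as a plain long. Two workarounds I've been sketching out (untested; the transform alias convertTs is made up, and starts_on is the field from our table):

```properties
# Option 1 (source side): have Debezium emit temporal columns using
# Kafka Connect's built-in Timestamp logical type, which the JDBC sink
# binds as java.sql.Timestamp. Note this mode is millisecond precision.
time.precision.mode=connect

# Option 2 (sink side): convert the epoch-millis field back into a
# Connect Timestamp with the TimestampConverter SMT before the insert.
transforms=convertTs
transforms.convertTs.type=org.apache.kafka.connect.transforms.TimestampConverter$Value
transforms.convertTs.field=starts_on
transforms.convertTs.target.type=Timestamp
```

Haven't confirmed either of these against our pipeline yet, so I'd welcome corrections.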
The error message in context:
2018-10-18 22:48:32,972 DEBUG || INSERT sql: INSERT INTO "funschema"."test_table"("user_id","subscription_code","source","receipt","starts_on") VALUES(?,?,?,?,?) [io.confluent.connect.jdbc.sink.BufferedRecords]
2018-10-18 22:48:32,987 WARN || Write of 28 records failed, remainingRetries=7 [io.confluent.connect.jdbc.sink.JdbcSinkTask]
java.sql.BatchUpdateException: [Amazon][JDBC](10120) Error converting data, invalid type for parameter: 5.
at com.amazon.jdbc.common.SStatement.createBatchUpdateException(Unknown Source)
at com.amazon.jdbc.common.SStatement.access$100(Unknown Source)
at com.amazon.jdbc.common.SStatement$BatchExecutionContext.createBatchUpdateException(Unknown Source)
at com.amazon.jdbc.common.SStatement$BatchExecutionContext.createResults(Unknown Source)
at com.amazon.jdbc.common.SStatement$BatchExecutionContext.doProcess(Unknown Source)
at com.amazon.jdbc.common.SStatement$BatchExecutionContext.processInt(Unknown Source)
at com.amazon.jdbc.common.SStatement.processBatchResults(Unknown Source)
at com.amazon.jdbc.common.SPreparedStatement.executeBatch(Unknown Source)
at io.confluent.connect.jdbc.sink.BufferedRecords.flush(BufferedRecords.java:138)
at io.confluent.connect.jdbc.sink.JdbcDbWriter.write(JdbcDbWriter.java:66)
at io.confluent.connect.jdbc.sink.JdbcSinkTask.put(JdbcSinkTask.java:75)
Thanks,
Tom