"Batch entry 0" INSERT INTO table error. What could the problem be?

Date: 2016-02-17 10:33:09

Tags: postgresql amazon-web-services amazon-s3 amazon-rds amazon-data-pipeline

I am implementing an AWS Data Pipeline that copies data from S3 into an RDS Postgres instance. The files in S3 are in CSV format. When the copy activity runs, it throws the following error:

errorStackTrace
amazonaws.datapipeline.taskrunner.TaskExecutionException: Error copying record at amazonaws.datapipeline.activity.copy.SingleThreadedCopyActivity.processAll(SingleThreadedCopyActivity.java:65) 
at amazonaws.datapipeline.activity.copy.SingleThreadedCopyActivity.runActivity(SingleThreadedCopyActivity.java:35) 
at amazonaws.datapipeline.activity.CopyActivity.runActivity(CopyActivity.java:22) 
at amazonaws.datapipeline.objects.AbstractActivity.run(AbstractActivity.java:16)
at amazonaws.datapipeline.taskrunner.TaskPoller.executeRemoteRunner(TaskPoller.java:136) 
at amazonaws.datapipeline.taskrunner.TaskPoller.executeTask(TaskPoller.java:105) 
at amazonaws.datapipeline.taskrunner.TaskPoller$1.run(TaskPoller.java:81) 
at private.com.amazonaws.services.datapipeline.poller.PollWorker.executeWork(PollWorker.java:76) 
at private.com.amazonaws.services.datapipeline.poller.PollWorker.run(PollWorker.java:53) 
at java.lang.Thread.run(Thread.java:745) 
Caused by: java.lang.RuntimeException: java.sql.BatchUpdateException: Batch entry 0 

I have looked at other answers to similar questions, but none of them actually solved this. Can you help me?
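A `BatchUpdateException: Batch entry 0` usually means the very first row of the batch was rejected by Postgres (bad column count, a value that doesn't match the column type, etc.), and the pipeline error hides the underlying SQL message. One way to narrow it down, before involving the pipeline at all, is to validate the CSV rows against the target schema locally. The sketch below is purely illustrative: the sample data, the column names, and the assumed schema (an integer `id` column) are hypothetical stand-ins for your actual file and table.

```python
import csv
import io

# Hypothetical sample standing in for the CSV pulled from S3;
# the real check would read the downloaded file instead.
sample = "id,name,created_at\n1,alice,2016-02-17\nx,bob,2016-02-17\n"

EXPECTED_COLS = 3  # assumed width of the target table

bad_rows = []
for lineno, row in enumerate(csv.reader(io.StringIO(sample)), start=1):
    if lineno == 1:
        continue  # skip the header row
    if len(row) != EXPECTED_COLS:
        bad_rows.append((lineno, "wrong column count"))
        continue
    try:
        int(row[0])  # assumed schema: first column is an integer id
    except ValueError:
        bad_rows.append((lineno, "non-integer id"))

# Rows listed here are the ones Postgres would reject in the batch.
print(bad_rows)
```

Running this against the sample flags line 3 (`x` is not a valid integer id), which is exactly the kind of row that makes the JDBC driver fail the whole batch. Checking for a stray header row, a trailing blank line, or a mismatched delimiter the same way often explains a "Batch entry 0" failure.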