I am using an Amazon Kinesis Firehose delivery stream to copy my data to Redshift. The data source is a DynamoDB stream, and I use the lambda-streams-to-firehose Lambda function to push the records into the delivery stream. After this Lambda executes, I get the following output:
{ invocationId: '0b214ec1-6b67-4c78-8881-9b3998555205',
  deliveryStreamArn: 'arn:aws:firehose:us-east-1:xxxxxxxx:deliverystream/<streamName>',
  region: 'us-east-1',
  records:
   [ { recordId: '49575469680524135041586805764649280618633657491608567810',
       approximateArrivalTimestamp: 1501146562192,
       data: 'eyJLZXlzIjp7IkRldmljZUlEIjp7IlMiOiJEQVZJUy1NLTIwLVcifSwiVGltZXN0YW1wIjp7IlMiOiIxNTAxMTQ2NTYwODI5In19LCJOZXdJbWFnZSI6eyJUZW1wZXJhdHVyZSI6eyJTIjoiNjMuMTM5OTk5OTk5OTk5OTkifSwiRGV2aWNlSUQiOnsiUyI6IkRBVklTLU0tMjAtVyJ9LCJQcmVzc3VyZSI6eyJTIjoiMTMyLjg0In0sIlRpbWVzdGFtcCI6eyJTIjoiMTUwMTE0NjU2MDgyOSJ9fSwiU2VxdWVuY2VOdW1iZXIiOiI0MDIxMTAyMDAwMDAwMDAwMDI0MDI0MDA1NzgiLCJTaXplQnl0ZXMiOjEyNiwiQXBwcm94aW1hdGVDcmVhdGlvbkRhdGVUaW1lIjoxNTAxMTQ2NTQwLCJldmVudE5hbWUiOiJJTlNFUlQifQo=' },
     { recordId: '49575469680524135041586805770929650251531656329085059074',
       approximateArrivalTimestamp: 1501146564204,
       data: 'eyJLZXlzIjp7IkRldmljZUlEIjp7IlMiOiJCSVo0SU5URUxMSUEtTElCMDIifSwiVGltZXN0YW1wIjp7IlMiOiIxNTAxMTQ2NTYzMTg4In19LCJOZXdJbWFnZSI6eyJDb2xpZm9ybUJhY3RlcmlhIjp7IlMiOiIzNiJ9LCJDeWFub0JhY3RlcmlhIjp7IlMiOiIyMDg0MSJ9LCJUZW1wZXJhdHVyZSI6eyJTIjoiODAifSwiRGV2aWNlSUQiOnsiUyI6IkJJWjRJTlRFTExJQS1MSUIwMiJ9LCJBbGthbGluaXR5Ijp7IlMiOiIyMzUifSwiVGltZXN0YW1wIjp7IlMiOiIxNTAxMTQ2NTYzMTg4In0sIkRlcHRoIjp7IlMiOiIyMCJ9LCJFQyI6eyJTIjoiMCJ9fSwiU2VxdWVuY2VOdW1iZXIiOiI0MDIxMTAzMDAwMDAwMDAwMDI0MDI0MDE1ODciLCJTaXplQnl0ZXMiOjE2OCwiQXBwcm94aW1hdGVDcmVhdGlvbkRhdGVUaW1lIjoxNTAxMTQ2NTQwLCJldmVudE5hbWUiOiJJTlNFUlQifQo=' } ] }
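For reference, each data field here is just the base64-encoded DynamoDB stream record. A minimal Node.js sketch for decoding one of them (decodeRecord is a helper introduced here for illustration, not part of lambda-streams-to-firehose):

    // Decode the base64 `data` field of a Firehose record to inspect
    // the underlying DynamoDB stream image.
    function decodeRecord(record) {
      const json = Buffer.from(record.data, 'base64').toString('utf8');
      return JSON.parse(json);
    }

    // decodeRecord(output.records[0]) yields an object like:
    // { Keys: { DeviceID: { S: 'DAVIS-M-20-W' }, Timestamp: { S: '1501146560829' } },
    //   NewImage: { Temperature: { S: '63.13999999999999' }, ... },
    //   SequenceNumber: '...', SizeBytes: 126,
    //   ApproximateCreationDateTime: 1501146540, eventName: 'INSERT' }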
Before the data is stored to S3, I have configured another Lambda for data transformation, which gives me the following output:
[ { recordId: '49575469680524135041586805764649280618633657491608567810',
    result: 'Ok',
    data: 'eyJLZXlzIjp7IkRldmljZUlEIjp7IlMiOiJEQVZJUy1NLTIwLVcifSwiVGltZXN0YW1wIjoiMDcuMjcuMjAxNyAwOTowOToyMCJ9LCJOZXdJbWFnZSI6eyJUZW1wZXJhdHVyZSI6eyJTIjoiNjMuMTM5OTk5OTk5OTk5OTkifSwiRGV2aWNlSUQiOnsiUyI6IkRBVklTLU0tMjAtVyJ9LCJQcmVzc3VyZSI6eyJTIjoiMTMyLjg0In0sIlRpbWVzdGFtcCI6IjA3LjI3LjIwMTcgMDk6MDk6MjAifSwiU2VxdWVuY2VOdW1iZXIiOiI0MDIxMTAyMDAwMDAwMDAwMDI0MDI0MDA1NzgiLCJTaXplQnl0ZXMiOjEyNiwiQXBwcm94aW1hdGVDcmVhdGlvbkRhdGVUaW1lIjoxNTAxMTQ2NTQwLCJldmVudE5hbWUiOiJJTlNFUlQifQ==' },
  { recordId: '49575469680524135041586805770929650251531656329085059074',
    result: 'Ok',
    data: 'eyJLZXlzIjp7IkRldmljZUlEIjp7IlMiOiJCSVo0SU5URUxMSUEtTElCMDIifSwiVGltZXN0YW1wIjoiMDcuMjcuMjAxNyAwOTowOToyMyJ9LCJOZXdJbWFnZSI6eyJDb2xpZm9ybUJhY3RlcmlhIjp7IlMiOiIzNiJ9LCJDeWFub0JhY3RlcmlhIjp7IlMiOiIyMDg0MSJ9LCJUZW1wZXJhdHVyZSI6eyJTIjoiODAifSwiRGV2aWNlSUQiOnsiUyI6IkJJWjRJTlRFTExJQS1MSUIwMiJ9LCJBbGthbGluaXR5Ijp7IlMiOiIyMzUifSwiVGltZXN0YW1wIjoiMDcuMjcuMjAxNyAwOTowOToyMyIsIkRlcHRoIjp7IlMiOiIyMCJ9LCJFQyI6eyJTIjoiMCJ9fSwiU2VxdWVuY2VOdW1iZXIiOiI0MDIxMTAzMDAwMDAwMDAwMDI0MDI0MDE1ODciLCJTaXplQnl0ZXMiOjE2OCwiQXBwcm94aW1hdGVDcmVhdGlvbkRhdGVUaW1lIjoxNTAxMTQ2NTQwLCJldmVudE5hbWUiOiJJTlNFUlQifQ==' } ]
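For comparison, a Firehose data-transformation Lambda generally has the shape sketched below. This is a minimal Node.js sketch, not your actual function: reformatTimestamps is a hypothetical stand-in for whatever transformation you apply, and the appended '\n' is one common way to keep the JSON objects separated once Firehose concatenates them into a single S3 object. (Note that your incoming records above end in 'fQo=', i.e. '}\n', while the transformed ones end in 'fQ==', i.e. '}', so the transformation appears to be dropping that trailing newline.)

    // Hypothetical stand-in for the timestamp reformatting shown above.
    function reformatTimestamps(entry) {
      return entry; // replace with the real transformation
    }

    // Firehose transformation contract: return every recordId with a
    // result of 'Ok', 'Dropped', or 'ProcessingFailed' plus base64 data.
    exports.handler = (event, context, callback) => {
      const output = event.records.map((record) => {
        const entry = JSON.parse(Buffer.from(record.data, 'base64').toString('utf8'));
        const transformed = reformatTimestamps(entry);
        return {
          recordId: record.recordId,
          result: 'Ok',
          data: Buffer.from(JSON.stringify(transformed) + '\n').toString('base64'),
        };
      });
      callback(null, { records: output });
    };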
Now, when I look at the output above, I can see the boundary between the two records, but once the data is stored to S3 the objects are written back to back with no separator between them. That may be why I am getting an error in Redshift. Can anyone tell me what I am missing in the Lambda or in the Firehose delivery stream?
Answer 0 (score: 0)
In your Firehose delivery stream, did you add the Redshift COPY option DELIMITER ','?
The errors above ("Delimiter not found" / "String length exceeds DDL length") occur when your Redshift COPY options are missing the DELIMITER parameter.
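For example, putting DELIMITER ',' in the delivery stream's Redshift "COPY options" field makes Firehose issue a command roughly like the one below (sensor_data, the bucket path, and the role are placeholders here; Firehose supplies the manifest and credentials itself):

    COPY sensor_data
    FROM 's3://your-bucket/path/to/manifest'
    CREDENTIALS 'aws_iam_role=arn:aws:iam::xxxxxxxx:role/firehose-role'
    MANIFEST
    DELIMITER ',';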
If I have made a wrong assumption, please comment and I will refocus my answer.