I'm trying to use AWS Database Migration Service (DMS) to load a test CSV file from S3 into a DynamoDB table. I'm new to AWS, so please forgive me if I'm doing something wrong.
I've created and tested the source and target endpoints without any problems. However, I'm running into task definition errors (and I don't know why, since my logs never show up in CloudWatch).
For simplicity, my test source file in S3 has a single column: eventId. The path is: s3://myBucket/testFolder/events/events.csv
Here is the JSON external table definition file:
{
  "TableCount": "1",
  "Tables": [
    {
      "TableName": "events",
      "TablePath": "testFolder/events/",
      "TableOwner": "testFolder",
      "TableColumns": [
        {
          "ColumnName": "eventId",
          "ColumnType": "STRING",
          "ColumnNullable": "false",
          "ColumnIsPk": "true",
          "ColumnLength": "10"
        }
      ],
      "TableColumnsTotal": "1"
    }
  ]
}
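To rule out simple syntax problems (for example, a trailing comma after the last key, which strict JSON parsers reject and which DMS will not accept either), the definition can be checked locally before uploading. A minimal sketch, with the definition embedded as a string purely for illustration:

```python
import json

# The external table definition from above, embedded as a string for the check.
TABLE_DEF = """
{
  "TableCount": "1",
  "Tables": [
    {
      "TableName": "events",
      "TablePath": "testFolder/events/",
      "TableOwner": "testFolder",
      "TableColumns": [
        {
          "ColumnName": "eventId",
          "ColumnType": "STRING",
          "ColumnNullable": "false",
          "ColumnIsPk": "true",
          "ColumnLength": "10"
        }
      ],
      "TableColumnsTotal": "1"
    }
  ]
}
"""

# Raises json.JSONDecodeError if there is a syntax error (e.g. a trailing comma).
table_def = json.loads(TABLE_DEF)

# Basic consistency checks: the count fields should match the actual list lengths.
assert table_def["TableCount"] == str(len(table_def["Tables"]))
table = table_def["Tables"][0]
assert table["TableColumnsTotal"] == str(len(table["TableColumns"]))
print("definition parses cleanly")
```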
And here is my task definition:
{
  "rules": [
    {
      "rule-type": "selection",
      "rule-id": "1",
      "rule-name": "1",
      "object-locator": {
        "schema-name": "testFolder",
        "table-name": "events"
      },
      "rule-action": "include"
    },
    {
      "rule-type": "object-mapping",
      "rule-id": "2",
      "rule-name": "2",
      "rule-action": "map-record-to-record",
      "object-locator": {
        "schema-name": "testFolder",
        "table-name": "tableName"
      },
      "target-table-name": "myTestDynamoDBTable",
      "mapping-parameters": {
        "partition-key-name": "eventId",
        "attribute-mappings": [
          {
            "target-attribute-name": "eventId",
            "attribute-type": "scalar",
            "attribute-sub-type": "string",
            "value": "${eventId}"
          }
        ]
      }
    }
  ]
}
Every time, my task errors out. I'm especially confused about the schema: since my source file is in S3, I assumed I wouldn't need a schema at all. I found the following line in the AWS documentation:
s3://mybucket/hr/employee. At load time, AWS DMS assumes that the source schema name is hr...
-> So, should I include some kind of schema file in the hr folder?
Apologies if I've got something wrong here; any pointers would be appreciated. Thanks.
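If I'm reading that line right, DMS derives the schema and table names purely from the folder path under the bucket (so testFolder would be my schema and events my table), with no schema file involved. A rough sketch of my understanding of that naming convention (my own illustration, not DMS code):

```python
# My reading of the doc line above: for an S3 source, DMS takes the schema
# name from the first folder in the path and the table name from the second,
# rather than from any schema file.
def schema_and_table(path: str) -> tuple[str, str]:
    # e.g. "hr/employee/employee.csv" -> ("hr", "employee")
    parts = path.strip("/").split("/")
    return parts[0], parts[1]

print(schema_and_table("hr/employee/employee.csv"))      # ('hr', 'employee')
print(schema_and_table("testFolder/events/events.csv"))  # ('testFolder', 'events')
```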