I'm trying to import data into a DynamoDB table from a CSV file. The connection and everything else works fine, but I keep getting an error on lines 16-17. The items listed in the CSV file match the variables in the code exactly. (I have more than 2 columns and around 415 rows, so this is frustrating.)
I tried deleting the extra rows to narrow it down to a single row, but it still errors on 'name':row.split(',')[0].
import boto3

s3 = boto3.client('s3')
dynamodb = boto3.resource('dynamodb')

def csv_reader(event, context):
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = event['Records'][0]['s3']['object']['key']
    obj = s3.get_object(Bucket=bucket, Key=key)
    rows = obj['Body'].read().split('\n')
    table = dynamodb.Table('ServiceOfferings')

    with table.batch_writer() as batch:
        for row in rows:
            batch.put_item(Item={
                'name':row.split(',')[0],
                'business_criticality':row.split(',')[1]
            })
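For reference, this is a minimal sketch of the same read-and-batch-write flow written with Python's csv module instead of manual split(',') indexing. It assumes the same 'ServiceOfferings' table with 'name' as a string key attribute, and a Python 3 runtime where the S3 body comes back as bytes and needs decoding:

import csv
import io

import boto3

s3 = boto3.client('s3')
dynamodb = boto3.resource('dynamodb')

def csv_reader(event, context):
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = event['Records'][0]['s3']['object']['key']
    obj = s3.get_object(Bucket=bucket, Key=key)

    # On a Python 3 runtime the body is bytes, so decode before parsing (assumption).
    body = obj['Body'].read().decode('utf-8')

    table = dynamodb.Table('ServiceOfferings')
    with table.batch_writer() as batch:
        # csv.reader handles quoting and splits each line into a list of fields.
        for row in csv.reader(io.StringIO(body)):
            if not row:
                continue  # skip blank or trailing lines
            batch.put_item(Item={
                'name': row[0].strip(),
                'business_criticality': row[1].strip()
            })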
Here is the error I get:

ClientError: An error occurred (ValidationException) when calling the BatchWriteItem operation: The provided key element does not match the schema
Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 17, in csv_reader
    'business_criticality':row.split(',')[1]
  File "/var/runtime/boto3/dynamodb/table.py", line 101, in put_item
    self._add_request_and_process({'PutRequest': {'Item': Item}})
  File "/var/runtime/boto3/dynamodb/table.py", line 110, in _add_request_and_process
    self._flush_if_needed()
  File "/var/runtime/boto3/dynamodb/table.py", line 131, in _flush_if_needed
    self._flush()
  File "/var/runtime/boto3/dynamodb/table.py", line 137, in _flush
    RequestItems={self._table_name: items_to_send})
  File "/var/runtime/botocore/client.py", line 357, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/var/runtime/botocore/client.py", line 661, in _make_api_call
    raise error_class(parsed_response, operation_name)
ClientError: An error occurred (ValidationException) when calling the BatchWriteItem operation: The provided key element does not match the schema
The expected result is that the data gets written to the DynamoDB table, which should end up with around 5 columns and 415 rows.