Does DynamoDB throttle the QPS of put_item operations?
I'm trying to migrate some records from a JSON file to DynamoDB using aiobotocore, an async wrapper around boto3.
With put_item it is hard to reach the table's provisioned write throughput.
After switching from put_item to batch_write_item, the write capacity is easily saturated.
# Code like this
import asyncio

import aiobotocore

resource = iter([])  # placeholder: records loaded from the JSON file
throughput = 1000
table = 'my_table'

loop = asyncio.get_event_loop()
dydb = aiobotocore.get_session().create_client('dynamodb')

async def put_item(cli, item, table_name):
    await cli.put_item(TableName=table_name, Item=item)

async def batch_write_item(cli, items, table_name):
    # RequestItems maps the table name to a list of write requests
    await cli.batch_write_item(
        RequestItems={table_name: [{'PutRequest': {'Item': item}} for item in items]}
    )

async def test_put_item(resource, table_name):
    for item in resource:
        await put_item(dydb, item, table_name)

async def test_batch_write_item(resource, table_name):
    batch = []
    for item in resource:
        batch.append(item)
        if len(batch) == 25:  # batch_write_item accepts at most 25 items per request
            await batch_write_item(dydb, batch, table_name)
            batch = []
    if batch:  # flush the final partial batch
        await batch_write_item(dydb, batch, table_name)

if __name__ == "__main__":
    # put item
    tasks = [test_put_item(resource, table) for _ in range(throughput)]
    loop.run_until_complete(asyncio.gather(*tasks))
    # batch write
    # tasks = [test_batch_write_item(resource, table) for _ in range(throughput // 25)]
    # loop.run_until_complete(asyncio.gather(*tasks))
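For context on the batching side: batch_write_item caps a single request at 25 items and expects RequestItems to map each table name to a list of PutRequest entries. A minimal sketch of a chunking helper that builds those payloads (the name chunk_requests and the table name 'my_table' are illustrative, not from the original code) might look like:

```python
def chunk_requests(items, table_name, batch_size=25):
    """Yield RequestItems payloads for batch_write_item, 25 items per request."""
    batch = []
    for item in items:
        batch.append({'PutRequest': {'Item': item}})
        if len(batch) == batch_size:
            yield {table_name: batch}
            batch = []
    if batch:  # emit the final partial batch instead of dropping it
        yield {table_name: batch}

# 60 records split into payloads of 25, 25, and 10 items
payloads = list(chunk_requests(
    [{'id': {'S': str(i)}} for i in range(60)], 'my_table'))
```

Each yielded payload can then be passed as RequestItems to batch_write_item; note the API may still return UnprocessedItems under throttling, which a production loop would need to retry.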