Multipart upload using boto3

Date: 2018-03-23 07:26:03

Tags: python amazon-web-services boto3

I am following the documentation below to perform a multipart upload with boto3, but I have not been able to get it working. Could you explain the concept and the syntax for it?

http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.create_multipart_upload

http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.complete_multipart_upload

1 answer:

Answer 0 (score: 2)

From this documentation:

  Using the Transfer Manager

  boto3 provides interfaces for managing various types of transfers with S3. Functionality includes:

  - Automatically managing multipart and non-multipart uploads

  To ensure that multipart uploads only happen when absolutely necessary, you can use the multipart_threshold configuration parameter.

The following Python code uploads a file to S3 and lets the transfer manager handle multipart uploads automatically.

import argparse
import boto3
import botocore
import os
import pandas as pd
from boto3.s3.transfer import TransferConfig

def environment_set(access_key,secret_access_key):
    os.environ["AWS_ACCESS_KEY_ID"] = access_key
    os.environ["AWS_SECRET_ACCESS_KEY"] = secret_access_key

def s3_upload_file(args):
    while True:
        try:
            s3 = boto3.resource('s3')

            GB = 1024 ** 3

            # Ensure that multipart uploads only happen if the size of a transfer
            # is larger than S3's size limit for non-multipart uploads, which is 5 GB.
            config = TransferConfig(multipart_threshold=5 * GB)

            s3.meta.client.upload_file(args.path, args.bucket,
                                       os.path.basename(args.path), Config=config)
            print("S3 upload successful")
            break
        except botocore.exceptions.EndpointConnectionError:
            print("Network error: please check your internet connection")
        except Exception as e:
            print(e)


if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='UPLOAD A FILE TO PRE-EXISTING S3 BUCKET')
    parser.add_argument('path', metavar='PATH', type=str,
            help='Enter the Path to file to be uploaded to s3')
    parser.add_argument('bucket', metavar='BUCKET_NAME', type=str,
            help='Enter the name of the bucket to which file has to be uploaded')
    parser.add_argument('cred', metavar='CREDENTIALS', type=str,
            help='Enter the Path to credentials.csv, having AWS access key and secret access key')    
    args = parser.parse_args()
    # Assumes the credentials.csv layout downloaded from the AWS console:
    # row 1 holds the access key (column 1) and secret access key (column 2).
    df = pd.read_csv(args.cred, header=None)
    access_key = df.iloc[1, 1]
    secret_access_key = df.iloc[1, 2]
    environment_set(access_key, secret_access_key)
    s3_upload_file(args)