Syncing two buckets with boto3

Date: 2018-11-28 10:41:27

Tags: python amazon-web-services amazon-s3 boto3

Is it possible, using boto3, to loop over the contents of two different buckets (a source bucket and a destination bucket), and whenever a key is found in the source that has no match in the destination, upload it to the destination bucket? Note that I do not want to use AWS S3 sync. I am currently using the following code for this:

import boto3

s3 = boto3.resource('s3')
src = s3.Bucket('sourcenabcap')
dst = s3.Bucket('destinationnabcap')
objs = list(dst.objects.all())
for k in src.objects.all():
    if k.key != objs[0].key:
        # copy the k.key to target

4 Answers:

Answer 0 (score: 3)

In case you decide not to stay purely in boto3: the sync command is still not available in boto3 itself, so you can call the AWS CLI directly:

# python 3

import os

sync_command = "aws s3 sync s3://source-bucket/ s3://destination-bucket/"
os.system(sync_command)
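
If the bucket names come from variables, a subprocess-based variant avoids assembling a shell string by hand and raises on a non-zero exit code. This is just a sketch; source_bucket and destination_bucket are placeholder names:

import subprocess

source_bucket = "source-bucket"            # placeholder name
destination_bucket = "destination-bucket"  # placeholder name

# Passing the arguments as a list avoids shell quoting issues;
# check=True raises CalledProcessError if the CLI call fails.
subprocess.run(
    ["aws", "s3", "sync", f"s3://{source_bucket}/", f"s3://{destination_bucket}/"],
    check=True,
)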

Answer 1 (score: 1)

If you only want to compare by key (ignoring differences between the objects themselves), you could use something like this:

import boto3

s3 = boto3.resource('s3')
source_bucket = s3.Bucket('source')
destination_bucket = s3.Bucket('destination')
destination_keys = {obj.key for obj in destination_bucket.objects.all()}
for obj in source_bucket.objects.all():
    if obj.key not in destination_keys:
        # copy the missing object across under the same key
        destination_bucket.copy({'Bucket': source_bucket.name, 'Key': obj.key}, obj.key)
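
Building destination_keys as a set keeps the membership check O(1) per source object, and the objects.all() collections page through the bucket automatically, so this also works for buckets with more than 1000 objects.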

Answer 2 (score: 0)

I just implemented a simple class for this (syncing a local folder to a bucket). I am posting it here in the hope that it helps anyone facing the same problem.

You could modify S3Sync.sync to take file size into account as well; a rough sketch of that variant follows the class below.

from bisect import bisect_left
from pathlib import Path

import boto3


class S3Sync:
    """
    Class that holds the operations needed to synchronize local dirs to a given bucket.
    """

    def __init__(self):
        self._s3 = boto3.client('s3')

    def sync(self, source: str, dest: str) -> None:
        """
        Sync source to dest: every element that exists in source but
        not in dest is copied to dest.

        No element is deleted.

        :param source: Source folder.
        :param dest: Destination bucket name.

        :return: None
        """

        paths = self.list_source_objects(source_folder=source)
        objects = self.list_bucket_objects(dest)

        # Getting the keys and ordering to perform binary search
        # each time we want to check if any paths is already there.
        object_keys = [obj['Key'] for obj in objects]
        object_keys.sort()
        object_keys_length = len(object_keys)

        for path in paths:
            # Binary search over the sorted keys.
            index = bisect_left(object_keys, path)
            if index == object_keys_length or object_keys[index] != path:
                # path was not found in object_keys, so it has to be synced.
                self._s3.upload_file(str(Path(source).joinpath(path)), Bucket=dest, Key=path)

    def list_bucket_objects(self, bucket: str) -> [dict]:
        """
        List all objects for the given bucket.

        :param bucket: Bucket name.
        :return: A [dict] containing the elements in the bucket.

        Example of a single object.

        {
            'Key': 'example/example.txt',
            'LastModified': datetime.datetime(2019, 7, 4, 13, 50, 34, 893000, tzinfo=tzutc()),
            'ETag': '"b11564415be7f58435013b414a59ae5c"',
            'Size': 115280,
            'StorageClass': 'STANDARD',
            'Owner': {
                'DisplayName': 'webfile',
                'ID': '75aa57f09aa0c8caeab4f8c24e99d10f8e7faeebf76c078efc7c6caea54ba06a'
            }
        }

        """
        try:
            # Note: list_objects returns at most 1000 keys per call; a
            # paginator would be needed for larger buckets.
            contents = self._s3.list_objects(Bucket=bucket)['Contents']
        except KeyError:
            # No 'Contents' key means the bucket is empty.
            return []
        else:
            return contents

    @staticmethod
    def list_source_objects(source_folder: str) -> [str]:
        """
        :param source_folder:  Root folder for resources you want to list.
        :return: A [str] containing relative names of the files.

        Example:

            /tmp
                - example
                    - file_1.txt
                    - some_folder
                        - file_2.txt

            >>> sync.list_source_objects("/tmp/example")
            ['file_1.txt', 'some_folder/file_2.txt']

        """

        path = Path(source_folder)

        paths = []

        for file_path in path.rglob("*"):
            if file_path.is_dir():
                continue
            str_file_path = str(file_path)
            str_file_path = str_file_path.replace(f'{str(path)}/', "")
            paths.append(str_file_path)

        return paths


if __name__ == '__main__':
    sync = S3Sync()
    sync.sync("/temp/some_folder", "some_bucket_name")
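
As a rough illustration of the size-aware variant mentioned above, a method like the following could be added to S3Sync. This is only a sketch, not part of the original answer: it compares each local file's size with the Size reported by list_bucket_objects and uploads when the key is missing or the sizes differ.

    def sync_by_size(self, source: str, dest: str) -> None:
        """
        Hypothetical variant of sync: re-upload a file when its size
        differs from the size stored in the bucket.
        """
        paths = self.list_source_objects(source_folder=source)
        # Map each remote key to its stored size for O(1) lookups.
        remote_sizes = {obj['Key']: obj['Size'] for obj in self.list_bucket_objects(dest)}

        for path in paths:
            local_file = Path(source).joinpath(path)
            # Upload when the key is missing or the sizes do not match.
            if remote_sizes.get(path) != local_file.stat().st_size:
                self._s3.upload_file(str(local_file), Bucket=dest, Key=path)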

Answer 3 (score: 0)

  1. Get the destination account ID DEST_ACCOUNT_ID

  2. Create the source bucket and attach this policy

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "DelegateS3Access",
                "Effect": "Allow",
                "Principal": {
                    "AWS": "arn:aws:iam::DEST_ACCOUNT_ID:root"
                },
                "Action": [
                    "s3:ListBucket",
                    "s3:GetObject"
                ],
                "Resource": [
                    "arn:aws:s3:::s3-copy-test/*",
                    "arn:aws:s3:::s3-copy-test"
                ]
            }
        ]
    }

  3. Create the files to be copied

  4. Create a user in the destination account and configure the AWS CLI with that user

  5. Create the destination bucket in the destination account

  6. Attach this policy to the IAM user in the destination account

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "s3:ListBucket",
                    "s3:GetObject"
                ],
                "Resource": [
                    "arn:aws:s3:::s3-copy-test",
                    "arn:aws:s3:::s3-copy-test/*"
                ]
            },
            {
                "Effect": "Allow",
                "Action": [
                    "s3:ListBucket",
                    "s3:PutObject",
                    "s3:PutObjectAcl"
                ],
                "Resource": [
                    "arn:aws:s3:::s3-copy-test-dest",
                    "arn:aws:s3:::s3-copy-test-dest/*"
                ]
            }
        ]
    }

  7. Run the file sync

aws s3 sync s3://s3-copy-test s3://s3-copy-test-dest --source-region eu-west-1 --region eu-west-1
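
Since the question asked for boto3 rather than the CLI, the same copy can in principle be done with boto3 once the bucket policy and IAM policy above are in place. The following is only a sketch that reuses the bucket names from this example and assumes the configured credentials belong to the destination-account user:

import boto3

s3 = boto3.resource('s3')
src = s3.Bucket('s3-copy-test')        # source bucket from the policy above
dst = s3.Bucket('s3-copy-test-dest')   # destination bucket in the destination account

# Copy every key that is not yet present in the destination bucket.
existing = {obj.key for obj in dst.objects.all()}
for obj in src.objects.all():
    if obj.key not in existing:
        # Requires s3:GetObject on the source (granted by the bucket policy)
        # and s3:PutObject on the destination (granted by the IAM policy).
        dst.copy({'Bucket': src.name, 'Key': obj.key}, obj.key)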