I can't find any solution for copying content from one key to another within an S3 bucket using boto in Python.
Suppose a bucket B1 contains keys structured like B1/x/*. I want to recursively copy all objects from B1/x/* to B1/y/*.
Answer 0 (score: 4)
There are no "directories" in S3. The "/" separator is just part of the object name, which is why boto has no such function. Write a script to handle it, or use a third-party tool.
AWS customerapps lists s3browser, which provides this kind of arbitrary directory-copy functionality. The typical free version only spawns two threads to move files; the paid version lets you specify more threads and runs faster.
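Since keys are flat strings, "moving" B1/x/* to B1/y/* really just means rewriting each key's prefix. A minimal pure-Python sketch of that rewrite (rewrite_key is a hypothetical helper, not part of boto):

```python
def rewrite_key(key, old_prefix, new_prefix):
    # S3 has no real folders: "moving a directory" means rewriting
    # the prefix of every key string under it
    if not key.startswith(old_prefix):
        raise ValueError("key does not start with the expected prefix")
    return new_prefix + key[len(old_prefix):]

print(rewrite_key("x/2021/data.csv", "x/", "y/"))  # → y/2021/data.csv
```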
Or you can simply write a script that uses s3.client.copy_object to copy each file to another name and then deletes the originals. For example:
import boto3

s3 = boto3.client("s3")

# list_objects_v2() gives more info than list_objects()
more_objects = True
found_token = False  # no continuation token on the first call
while more_objects:
    if not found_token:
        response = s3.list_objects_v2(
            Bucket="mybucket",
            Prefix="B1/x/",
            Delimiter="/")
    else:
        response = s3.list_objects_v2(
            Bucket="mybucket",
            ContinuationToken=found_token,
            Prefix="B1/x/",
            Delimiter="/")
    # use copy_object or copy_from
    for source in response["Contents"]:
        raw_name = source["Key"].split("/")[-1]
        new_name = "new_structure/{}".format(raw_name)
        s3.copy_object(
            Bucket="mybucket",
            CopySource={"Bucket": "mybucket", "Key": source["Key"]},
            Key=new_name)
    # Now check whether there are more objects to list
    if "NextContinuationToken" in response:
        found_token = response["NextContinuationToken"]
        more_objects = True
    else:
        more_objects = False
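The snippet above only copies; the answer's approach also deletes the originals afterwards. delete_objects accepts at most 1000 keys per request, so the copied keys need batching first. A sketch of just that batching step (batches and copied_keys are hypothetical names):

```python
def batches(keys, size=1000):
    # delete_objects accepts at most 1000 keys per request
    return [keys[i:i + size] for i in range(0, len(keys), size)]

copied_keys = ["B1/x/file{}".format(i) for i in range(2500)]  # hypothetical
print([len(b) for b in batches(copied_keys)])  # → [1000, 1000, 500]
```

Each batch would then be passed to s3.delete_objects(Bucket="mybucket", Delete={"Objects": [{"Key": k} for k in batch]}).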
**Important note**: list_objects returns at most 1000 keys per call, and MaxKeys cannot raise that limit. So you must use list_objects_v2, check whether a NextContinuationToken was returned to see if there are more objects, and repeat until the listing is exhausted.
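boto3 can also do this token bookkeeping for you via s3.get_paginator("list_objects_v2"). The manual loop itself can be exercised without AWS by faking the paged response; here list_page is a toy stand-in for list_objects_v2:

```python
def list_page(items, token=0, page_size=2):
    # toy stand-in for list_objects_v2: one page of results, plus a
    # NextContinuationToken when more items remain
    page = {"Contents": items[token:token + page_size]}
    if token + page_size < len(items):
        page["NextContinuationToken"] = token + page_size
    return page

keys = ["B1/x/a", "B1/x/b", "B1/x/c", "B1/x/d", "B1/x/e"]
collected, token = [], None
while True:
    resp = list_page(keys) if token is None else list_page(keys, token)
    collected.extend(resp["Contents"])
    if "NextContinuationToken" not in resp:
        break
    token = resp["NextContinuationToken"]
print(collected == keys)  # → True
```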
Answer 1 (score: 1)
Trying to build on the previous answer:
import os
import boto3

self.s3 = boto3.client('s3')  # created once, e.g. in the class __init__

def copyFolderFromS3(self, pathFrom, bucketTo, locationTo):
    response = {}
    response['status'] = 'failed'
    getBucket = pathFrom.split('/')[2]
    location = '/'.join(pathFrom.split('/')[3:])
    if pathFrom.startswith('s3://'):
        copy_source = {'Bucket': getBucket, 'Key': location}
        uploadKey = locationTo
        self.recursiveCopyFolderToS3(copy_source, bucketTo, uploadKey)

def recursiveCopyFolderToS3(self, src, uplB, uplK):
    more_objects = True
    found_token = False  # no continuation token on the first call
    while more_objects:
        if not found_token:
            response = self.s3.list_objects_v2(
                Bucket=src['Bucket'],
                Prefix=src['Key'],
                Delimiter="/")
        else:
            response = self.s3.list_objects_v2(
                Bucket=src['Bucket'],
                ContinuationToken=found_token,
                Prefix=src['Key'],
                Delimiter="/")
        for source in response["Contents"]:
            raw_name = source["Key"].split("/")[-1]
            new_name = os.path.join(uplK, raw_name)
            if raw_name.endswith('_$folder$'):
                # treat the _$folder$ marker as a sub-directory and recurse
                src["Key"] = source["Key"].replace('_$folder$', '/')
                new_name = new_name.replace('_$folder$', '')
                self.recursiveCopyFolderToS3(src, uplB, new_name)
            else:
                src['Key'] = source["Key"]
                self.s3.copy_object(CopySource=src, Bucket=uplB, Key=new_name)
        # list_objects_v2 returns at most 1000 keys per call; keep paging
        if "NextContinuationToken" in response:
            found_token = response["NextContinuationToken"]
            more_objects = True
        else:
            more_objects = False
Or you can simply use the awscli, which is installed by default on EC2/EMR machines.
import subprocess

cmd = 'aws s3 cp ' + path + ' ' + uploadUrl + ' --recursive'
p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE)
p.communicate()
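Building the command by string concatenation breaks if the paths contain spaces, and shell=True invites quoting problems; passing an argument list avoids both. A sketch (the path and uploadUrl values are placeholders, and the actual run call is commented out so it does not require the aws CLI):

```python
import subprocess

path = "s3://mybucket/B1/x/"       # placeholder
uploadUrl = "s3://mybucket/B1/y/"  # placeholder
cmd = ["aws", "s3", "cp", path, uploadUrl, "--recursive"]
# subprocess.run(cmd, check=True) would raise CalledProcessError on failure
print(len(cmd))  # → 6
```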