In Python / Boto 3, I found that downloading a single file from S3 to a local directory can be done like this:

```
bucket = self._aws_connection.get_bucket(aws_bucketname)
for s3_file in bucket.list():
    if filename == s3_file.name:
        self._downloadFile(s3_file, local_download_directory)
        break
```
And to download all files under a chosen directory:

```
else:  # (this `else` pairs with a filename check not shown here)
    bucket = self._aws_connection.get_bucket(aws_bucketname)
    for s3_file in bucket.list():
        self._downloadFile(s3_file, local_download_directory)
```
The helper function `_downloadFile()`:

```
def _downloadFile(self, s3_file, local_download_destination):
    full_local_path = os.path.expanduser(os.path.join(local_download_destination, s3_file.name))
    try:
        print "Downloaded: %s" % (full_local_path)
        s3_file.get_contents_to_filename(full_local_path)
    except Exception as e:  # except clause added; it was cut off in the original post
        print "Download failed: %s" % (e)
```
But neither seems to work. Using Boto 3 and Python, I want to be able to download all files (ideally zipped) under a directory defined on S3 to my local machine.

What might I be doing wrong, and what is the correct implementation of the parameters?

Thanks in advance; I will definitely accept/upvote an answer.
Updated code. Getting an error: “AttributeError: 'S3' object has no attribute 'download'”
```
import sys
import json
import os
import subprocess
import boto3
from boto.s3.connection import S3Connection

s3 = boto3.resource('s3')
s3client = boto3.client('s3')

# This works
for bucket in s3.buckets.all():
    print(bucket.name)

def main():
    # Getting an error: "AttributeError: 'S3' object has no attribute 'download'"
    s3client.download('testbucket', 'fileone.json', 'newfile')

if __name__ == "__main__": main()
```
Answer 0 (score: 5)
To download a file from S3 to the local file system, use the `download_file()` method:
```
s3client = boto3.client('s3')
s3client.download_file(Bucket, Key, Filename)
```

If the S3 object is `s3://mybucket/foo/bar/file.txt`, then the arguments would be:

```
Bucket   --> mybucket
Key      --> foo/bar/file.txt
Filename --> /local/path/file.txt
```

There is no method to download an entire bucket in one call. The alternative is to list all the objects in the bucket and download each of them individually as a file.
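That list-and-download approach can be sketched as follows. This is only an illustration, not code from the original post: the function names `download_prefix` and `local_path_for`, and any bucket/prefix values you pass in, are placeholders.

```python
import os

def local_path_for(dest_dir, key):
    """Map an S3 object key to a local file path under dest_dir."""
    return os.path.join(dest_dir, *key.split('/'))

def download_prefix(bucket_name, prefix, dest_dir):
    """List every object under `prefix` and download it, mirroring the key layout locally."""
    import boto3  # deferred import: only needed when actually talking to AWS
    s3client = boto3.client('s3')
    resp = s3client.list_objects_v2(Bucket=bucket_name, Prefix=prefix)
    for obj in resp.get('Contents', []):  # 'Contents' is absent when nothing matched
        key = obj['Key']
        if key.endswith('/'):
            continue  # skip "folder" placeholder objects
        local_path = local_path_for(dest_dir, key)
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        s3client.download_file(bucket_name, key, local_path)
```

Note that a plain `list_objects_v2()` call returns at most one page of results, so this sketch is only complete for small prefixes.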
Note: the response of `list_objects()` is truncated at 1000 objects. Use the marker returned in the response to retrieve the rest of the objects in the bucket.