I am trying to download a large file (2.5 GB) from Google Cloud Storage using the code sample provided with the GS Python library. It works for smaller files (I have tested it with 1-2 KB files). I am using Python 2.7.5 on Windows 7.
import os
import StringIO

import boto

dest_dir = 'c:\\downloadfolder'
networkbucket = 'bucketname'

uri = boto.storage_uri(networkbucket, 'gs')
for obj in uri.get_bucket():
    print obj.name
    name = str(obj.name)
    local_dst_uri = boto.storage_uri(os.path.join(dest_dir, name), 'file')
    # Buffer the whole object in memory, then write it out to the local file.
    object_contents = StringIO.StringIO()
    src_uri = boto.storage_uri(networkbucket + '/' + name, 'gs')
    src_uri.get_key().get_file(object_contents)
    object_contents.seek(0)
    local_dst_uri.new_key().set_contents_from_file(object_contents)
    object_contents.close()
I get a MemoryError:
Traceback (most recent call last):
File "C:\folder\GS_Transfer.py", line 52, in <module>
src_uri.get_key().get_file(object_contents)
File "C:\gsutil\third_party\boto\boto\gs\key.py", line 165, in get_file
query_args=query_args)
File "C:\gsutil\third_party\boto\boto\s3\key.py", line 1455, in _get_file_internal
for bytes in self:
File "C:\gsutil\third_party\boto\boto\s3\key.py", line 364, in next
data = self.resp.read(self.BufferSize)
File "C:\gsutil\third_party\boto\boto\connection.py", line 414, in read
return httplib.HTTPResponse.read(self, amt)
File "C:\Python27\lib\httplib.py", line 567, in read
s = self.fp.read(amt)
File "C:\Python27\lib\socket.py", line 400, in read
buf.write(data)
MemoryError: out of memory
I can download the file fine from the command line with gsutil.py cp. What should I change in this code to make it work? I have been trying to find a way to download the file in chunks, but I'm not sure how.
Answer 0 (score: 1)
The problem is that you are reading the entire object contents into memory with StringIO. You can use the KeyFile class from boto.s3.keyfile instead:
from boto.s3.keyfile import KeyFile
Use it in place of StringIO:
local_dst_uri = boto.storage_uri(os.path.join(dest_dir, name), 'file')
src_uri = boto.storage_uri(networkbucket + '/' + name, 'gs')
# KeyFile wraps the remote key as a file-like object that is read in chunks,
# so the whole object is never held in memory at once.
keyfile = KeyFile(src_uri.get_key())
local_dst_uri.new_key().set_contents_from_file(keyfile)
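
For reference, here is a minimal sketch of your full loop with KeyFile swapped in for StringIO (untested, and assuming the same dest_dir and networkbucket placeholders as in your question):

import os

import boto
from boto.s3.keyfile import KeyFile

dest_dir = 'c:\\downloadfolder'   # placeholder path from the question
networkbucket = 'bucketname'      # placeholder bucket name from the question

uri = boto.storage_uri(networkbucket, 'gs')
for obj in uri.get_bucket():
    name = str(obj.name)
    local_dst_uri = boto.storage_uri(os.path.join(dest_dir, name), 'file')
    src_uri = boto.storage_uri(networkbucket + '/' + name, 'gs')
    # Stream the download through KeyFile instead of buffering it in StringIO,
    # so memory use stays bounded even for multi-GB objects.
    keyfile = KeyFile(src_uri.get_key())
    local_dst_uri.new_key().set_contents_from_file(keyfile)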