In a background job (Sidekiq) I download images from S3 URLs, zip them, then upload the archive and email the user a ZIP download link. The production EC2 instance has 16 GB of RAM. A 10 GB album succeeded (with `free` showing about 10 GB under buff/cache), but will the same flow crash once an album exceeds 50 GB? If so, what is the best approach?
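One note on the 10 GB run: the 10 GB shown under buff/cache is mostly the kernel's reclaimable page cache from writing the archive to disk, not memory held by the Sidekiq process, so `MemAvailable` is the better number to watch. A quick stdlib-only way to check it (this assumes a Linux host with `/proc/meminfo`, as on EC2):

```ruby
# Parse /proc/meminfo (Linux only). MemAvailable already discounts
# reclaimable page cache, which is what `free` reports as buff/cache.
meminfo = File.read("/proc/meminfo")
fields = meminfo.scan(/^(\w+):\s+(\d+) kB/).to_h { |k, v| [k, v.to_i] }

puts "MemTotal:     #{fields['MemTotal']} kB"
puts "MemAvailable: #{fields['MemAvailable']} kB"
```

If MemAvailable stays healthy while buff/cache grows during a job, the cache is just the kernel keeping recently written ZIP pages around and will be reclaimed under pressure.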
Below is the working code. Any advice is appreciated, thanks!
require 'open-uri'  # needed for open(url)

def self.generate_zip(album_id, user_id)
  album = Album.find(album_id)
  pictures = album.album_pictures
  temp_dir = Rails.root.join("tmp").to_s
  # Open the archive file on disk; the ZIP is written straight to it,
  # not accumulated in memory
  zip_file = File.open(File.join(temp_dir, "Album_#{album.name}_#{Date.today}.zip"), "wb")
  Zip::ZipOutputStream.write_buffer(zip_file) do |zip|
    pictures.each_with_index do |photo, index|
      zip.put_next_entry("#{index + 1}_#{photo.image_file_name}")
      # Stream each image body into the entry; the block form closes the IO
      open(photo.image_url) do |data|
        IO.copy_stream(data, zip)
      end
    end
  end
  zip_file.fsync # flush any buffered data to disk

  # Upload the ZIP to S3
  user = User.find(user_id)
  export = user.exports.build
  export.item_id = album.id       # was `object.id`; `object` is undefined here
  export.item_type = album.class.name
  export.archive = File.open(zip_file.path)
  export.save!
  export
ensure
  FileUtils.rm(zip_file.path) if zip_file && File.exist?(zip_file.path)
end