I have the following script that tries to upload backup files from my VM instance
to a storage bucket.
When I run the script, the smaller files (up to about 30 GB) are uploaded, but files of 80 GB or more never make it into the bucket.
python 2>/dev/null - <<EOF
import os
import socket
import logging
from datetime import datetime
from google.cloud import storage

storage_client = storage.Client()
today = datetime.today()
current_hour = today.strftime('%Y/%m/%d/%H')
hostname = socket.gethostname()
bucket = storage_client.get_bucket("fsa_backup")

# Upload HANA log backups under <hostname>/log/ in the bucket.
for subdir, dirs, files in os.walk('/hanabackup/log/DB_FRD/'):
    for file in files:
        backupfilename = os.path.join(subdir, file)
        if 'log_backup' in backupfilename:
            only_filename = backupfilename.split('/')[-1]
            backup_file = hostname + '/log/' + only_filename
            blob = bucket.blob(backup_file)
            blob.upload_from_filename(filename=backupfilename)

# Upload full data backups under <hostname>/data/ in the bucket.
for subdir, dirs, files in os.walk('/hanabackup/data/DB_FRD/'):
    for file in files:
        backupfilename = os.path.join(subdir, file)
        if 'COMPLETE_DATA_BACKUP' in backupfilename:
            only_filename = backupfilename.split('/')[-1]
            backup_file = hostname + '/data/' + only_filename
            blob = bucket.blob(backup_file)
            try:
                blob.upload_from_filename(filename=backupfilename)
            except Exception as e:
                logging.fatal(e, exc_info=True)
EOF
Answer 0 (score: 2)
For large files there is a higher risk of a network interruption or other failure while the upload is in progress. For such cases it is recommended to use a resumable media upload, enabled via a parameter of MediaFileUpload, which uploads your file in chunks. You can see how it is done in this GitHub code snippet.
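Below is a minimal sketch of that pattern using the google-api-python-client library, which provides MediaFileUpload. The bucket name and object path are placeholders borrowed from the question, and the 10 MB chunk size is an assumption (it must be a multiple of 256 KB):

from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

# JSON API client for Cloud Storage; uses application-default credentials
service = build('storage', 'v1')

# resumable=True turns this into a chunked, restartable upload
# (the local path below is a placeholder)
media = MediaFileUpload('/hanabackup/data/DB_FRD/COMPLETE_DATA_BACKUP',
                        chunksize=10 * 1024 * 1024,
                        resumable=True)
request = service.objects().insert(bucket='fsa_backup',
                                   name='data/COMPLETE_DATA_BACKUP',
                                   media_body=media)

# Drive the upload one chunk at a time; a failed chunk can be retried
# without restarting the whole transfer.
response = None
while response is None:
    status, response = request.next_chunk()
    if status:
        print('Uploaded %d%%' % int(status.progress() * 100))
print('Upload complete')

If you would rather stay with the google.cloud.storage client already used in your script, a similar effect can be achieved by giving the blob an explicit chunk_size, which makes upload_from_filename perform the upload in resumable chunks (again, the value must be a multiple of 256 KB):

# chunk_size switches the client to a chunked, resumable upload
blob = bucket.blob(backup_file, chunk_size=10 * 1024 * 1024)
blob.upload_from_filename(filename=backupfilename)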