How do I copy a directory to Google Cloud Storage using the Google Cloud Python API?

Asked: 2018-01-30 06:09:04

Tags: python python-3.x google-app-engine google-cloud-platform google-cloud-storage

The following function works well for copying a single file to Google Cloud Storage.

#!/usr/bin/python3.5
import googleapiclient.discovery

from google.cloud import storage

def upload_blob(bucket_name, source_file_name, destination_blob_name, project):
    storage_client = storage.Client(project=project)
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)

    blob.upload_from_filename(source_file_name)

    print('File {} uploaded to {}.'.format(
        source_file_name,
        destination_blob_name))

Now, instead of a file name, I tried passing a directory name, upload_blob('mybucket', '/data/inputdata/', 'myapp/inputdata/', 'myapp'), but then I got this error:

  

AttributeError: 'str' object has no attribute 'read'

Do I need to supply any additional arguments when calling blob.upload_from_file() in order to copy a directory?

2 Answers:

Answer 0 (score: 5):

Uploading multiple files at once is not a built-in feature of the API. You can either copy the files in a loop, or use the command-line utility, which can copy an entire directory.
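For reference, the command-line utility referred to here is gsutil, which can copy a local directory recursively; a sketch of the invocation, using the bucket and paths from the question as placeholders, would be:

gsutil -m cp -r /data/inputdata gs://mybucket/myapp/inputdata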

Answer 1 (score: 4):

You can use the following code to do this:

import os
import glob

def copy_local_directory_to_gcs(local_path, bucket, gcs_path):
    """Recursively copy a directory of files to GCS.

    local_path should be a directory and not have a trailing slash.
    """
    assert os.path.isdir(local_path)
    for local_file in glob.glob(local_path + '/**', recursive=True):
        if not os.path.isfile(local_file):
            continue  # skip directories; only files are uploaded
        # strip the local prefix (and its trailing slash) to build the object name
        remote_path = os.path.join(gcs_path, local_file[1 + len(local_path):])
        blob = bucket.blob(remote_path)
        blob.upload_from_filename(local_file)

Use it like this:

copy_local_directory_to_gcs('path/to/foo', bucket, 'remote/path/to/foo')
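Here bucket is a google-cloud-storage Bucket object rather than a string; it can be obtained the same way as in the question's code (the project and bucket names below are placeholders):

from google.cloud import storage

storage_client = storage.Client(project='myapp')
bucket = storage_client.get_bucket('mybucket')
copy_local_directory_to_gcs('path/to/foo', bucket, 'remote/path/to/foo')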