Compress a directory in a Google Cloud Storage bucket, then download it to a local directory

Date: 2018-06-30 21:23:44

Tags: google-cloud-platform google-cloud-storage gsutil

I have a directory named bar in a Google Cloud Storage bucket foo. The directory bar contains roughly one million small files (each about 1–2 KB).

According to this reference, when dealing with a large number of files I should use the gsutil -m option to download them in parallel, like this:

gsutil -m cp -r gs://foo/bar/ /home/username/local_dir

But given the total number of files (about 10^6), the whole download process is still very slow.

Is there a way to compress the entire directory in Cloud Storage and then download the compressed archive to a local folder?

1 Answer:

Answer 0 (score: 1)

There is no way to compress the directory in the cloud before copying, but you can speed up the copy by distributing the work across multiple machines. For example, have scripts like:

machine1 does gsutil -m cp -r gs://<bucket>/0* local_dir

machine2 does gsutil -m cp -r gs://<bucket>/a* local_dir

Depending on how your files are named, you may need to adjust the above, but hopefully you get the idea.
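The prefix-sharding idea above can be sketched as a small helper script. This is a minimal sketch, not part of the original answer: it assumes object names start with a digit or lowercase letter, and the bucket name (`foo`), subdirectory (`bar`), and machine count are placeholder assumptions. It only prints the `gsutil` command each machine should run; it does not execute anything against Cloud Storage.

```shell
#!/bin/sh
# Hypothetical sketch: assign object-name prefixes (0-9, a-z) round-robin
# to N machines and print the gsutil command each machine should run.
plan_downloads() {
  bucket=$1        # bucket name (assumed "foo" below)
  num_machines=$2  # number of worker machines (assumed 4 below)
  i=0
  for prefix in 0 1 2 3 4 5 6 7 8 9 \
                a b c d e f g h i j k l m n o p q r s t u v w x y z; do
    machine=$(( i % num_machines + 1 ))
    echo "machine${machine}: gsutil -m cp -r gs://${bucket}/bar/${prefix}* local_dir"
    i=$(( i + 1 ))
  done
}

plan_downloads foo 4
```

Each machine then runs its own share of the printed commands in parallel, which is exactly the manual machine1/machine2 split above, just generated mechanically.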