From App Engine

Time: 2017-07-13 11:43:51

Tags: python-2.7 google-app-engine google-cloud-platform google-cloud-storage

I can write small files to Google Cloud Storage from Google App Engine using the code below. When I try to write a larger file, I get a memory exception.

I suspect it is buffering the entire file in memory before writing it to Google Cloud Storage. Is there a way I can download the data chunk by chunk and write it to Google Cloud Storage in chunks? Or is the problem something else entirely?

def _writeFilesinGCS(filename, data):
  # Initialize the Google Cloud Storage writer with retry support
  print "In _writeFilesinGCS function"
  tmp_filenames_to_clean_up = []
  write_retry_params = _gcs.RetryParams(backoff_factor=1.1)
  gcs_file = _gcs.open(filename, 'w', content_type='text/plain',
                       retry_params=write_retry_params)
  gcs_file.write(data.encode('utf-8'))
  gcs_file.close()
  tmp_filenames_to_clean_up.append(filename)

def download_files(service, report_run_id, report_fragment, loaddate, file_name, cfg):
  """Download a report fragment and write it to Google Cloud Storage.

  Args:
    service: An authorized DoubleClick Search service.
    report_run_id: The ID DS has assigned to a report.
    report_fragment: The 0-based index of the file fragment from the files array.
    loaddate: Load date string used in the target filename.
    file_name: Base name for the target file.
    cfg: Config object holding the GCS bucket name.
  """
  bucket_name = cfg._gcsbucket
  bucket = '/' + bucket_name
  filename = bucket + '/' + file_name + "_" + loaddate + "_MMA_" + loaddate + ".csv"
  print "Enter into download_files", report_run_id
  request = service.reports().getFile(reportId=report_run_id, reportFragment=report_fragment)
  _writeFilesinGCS(filename, request.execute())
  dsbqfuns._dsbqinsert(report_run_id, cfg, file_name, 1)
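The chunked approach the question asks about could be sketched like this. This is illustrative only: `write_in_chunks` and `CHUNK_SIZE` are hypothetical names, and the in-memory streams stand in for the report response and the file object returned by `_gcs.open(...)`. Note that `request.execute()` returns the whole body at once; with the Google API Python client, `googleapiclient.http.MediaIoBaseDownload` is one way to fetch a response in chunks instead.

```python
import io

# GCS resumable uploads expect writes in multiples of 256 KB,
# so a chunk size like this is a reasonable default (assumption).
CHUNK_SIZE = 256 * 1024

def write_in_chunks(dest, source):
    """Copy `source` (a readable binary stream) to `dest` in
    CHUNK_SIZE pieces, so only one chunk is held in memory at a time.
    Returns the total number of bytes copied."""
    total = 0
    while True:
        chunk = source.read(CHUNK_SIZE)
        if not chunk:  # empty read signals end of stream
            break
        dest.write(chunk)
        total += len(chunk)
    return total

# Example with in-memory streams standing in for the DS response
# and the GCS file object:
src = io.BytesIO(b"x" * 600000)
dst = io.BytesIO()
n = write_in_chunks(dst, src)  # n == 600000, copied in three chunks
```

In the question's code, `dest` would be the object from `_gcs.open(filename, 'w', ...)`, and `source` would be a streaming handle on the report download rather than the fully materialized `request.execute()` result.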

0 Answers:

No answers