I am trying to download a file from an API and upload (stream) it directly into S3.
My local download code (which works fine):
import requests
import datetime
import os

headers = {'Authorization': 'apikey THISISHIDDEN'}
baseURL = 'https://api.test.au/busschedule/'
target_path = datetime.datetime.now().strftime('%Y-%m-%d schedule') + '.zip'

response = requests.get(baseURL, stream=True, headers=headers)
handle = open(target_path, "wb")
for chunk in response.iter_content(chunk_size=512):
    if chunk:  # filter out keep-alive new chunks
        handle.write(chunk)
handle.close()
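The loop above is just a streaming copy from one file-like object to another. As an illustration only (names are hypothetical, and io.BytesIO stands in for the HTTP response and the output file), the same pattern can be sketched with the stdlib:

```python
import io

def stream_copy(src, dst, chunk_size=512):
    # Copy src to dst in fixed-size chunks, mirroring the
    # iter_content loop, without buffering the whole payload.
    while True:
        chunk = src.read(chunk_size)
        if not chunk:  # empty bytes means end of stream
            break
        dst.write(chunk)

payload = b"x" * 2048          # stand-in for the downloaded zip
src = io.BytesIO(payload)
dst = io.BytesIO()
stream_copy(src, dst)
print(dst.getvalue() == payload)  # True
```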
My attempt to download and stream straight to S3 (which does not work):
# import requests
import datetime
import os
import boto3
import botocore.vendored.requests.packages.urllib3 as urllib3

# Get environment variables from serverless.yml
bucket = "bucket"
s3folder = "schedules"

# Set standard script parameters
headers = {'Authorization': 'apikey THISISHIDDEN'}
baseURL = 'https://api.test.au/busschedule/'

def run(event, context):
    s3 = boto3.client('s3')
    datetimestamp = datetime.datetime.today().strftime('%Y%m%dT%H%M%S')
    filename = datetimestamp + " bus schedule.zip"
    key = s3folder + '/' + filename  # your desired s3 path or filename
    http = urllib3.PoolManager()
    s3.upload_fileobj(http.request('GET', baseURL,
                                   headers=headers, preload_content=False),
                      bucket, key)

def main():
    run({}, {})

if __name__ == "__main__":
    exit(main())
The error returned by CloudWatch is:
InsecureRequestWarning: Unverified HTTPS request is being made. Timeout after 300.10s.
EDIT: The Lambda function's timeout is 300 seconds, but that should be more than enough to download the file (6 MB); the download completes locally in about 10 seconds. Does anyone have a better approach?
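For clarity, the S3 key built inside run() resolves to a path like the one below; a fixed datetime makes the strftime format concrete (folder name as in the question):

```python
import datetime

s3folder = "schedules"
# Fixed timestamp so the formatted result is reproducible.
stamp = datetime.datetime(2020, 1, 2, 3, 4, 5).strftime('%Y%m%dT%H%M%S')
key = s3folder + '/' + stamp + " bus schedule.zip"
print(key)  # schedules/20200102T030405 bus schedule.zip
```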
Answer 0 (score: 0)
Solved this by using the smart_open library:
from smart_open import smart_open  # older smart_open API; newer versions use smart_open.open

response = requests.get(baseURL, stream=True, headers=headers)
s3url = 's3://' + bucket + '/' + key
with smart_open(s3url, 'wb') as fout:
    fout.write(response.content)
I still have another issue to resolve (Lambda permissions), but that will be a separate question. Running this locally works fine.