How do I set the content type of a file in a website-enabled S3 account using the Python boto module?
I am doing this:
from boto.s3.connection import S3Connection
from boto.s3.key import Key
from boto.cloudfront import CloudFrontConnection
conn = S3Connection(access_key_id, secret_access_key)
bucket = conn.create_bucket('mybucket')
b = conn.get_bucket(bucket)
b.set_acl('public-read')
fn = 'index.html'
template = '<html>blah</html>'
k = Key(b)
k.key = fn
k.set_contents_from_string(template)
k.set_acl('public-read')
k.set_metadata('Content-Type', 'text/html')
However, when I access it at http://mybucket.s3-website-us-east-1.amazonaws.com/index.html, my browser prompts me to download the file instead of simply serving it as a web page.
Looking at the metadata in the S3 Management Console shows that the Content-Type has actually been set to "application/octet-stream". If I change it manually in the console I can access the page normally, but if I run my script again it resets it back to the wrong content type.
What am I doing wrong?
Answer 0 (score: 17)
The set_metadata method is really intended for setting user metadata on S3 objects. Many of the standard HTTP metadata fields have first-class attributes to represent them, e.g. content_type. Also, you want to set the metadata before you actually send the object to S3. Something like this should work:
import boto
conn = boto.connect_s3()
bucket = conn.get_bucket('mybucket') # Assumes bucket already exists
key = bucket.new_key('mykey')
# Set the standard header via its first-class attribute before uploading;
# set_metadata() is only for user (x-amz-meta-*) metadata.
key.content_type = 'text/html'
key.set_contents_from_string(mystring, policy='public-read')
Note that you can set a canned ACL policy at the time you write the object to S3, which saves you from having to make a separate API call.
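If you prefer to keep the question's set_contents_from_string style, boto's upload calls also accept a headers dict; to the best of my knowledge, passing Content-Type there works as well. A minimal sketch, reusing the bucket name and HTML string from the question:
import boto
from boto.s3.key import Key

conn = boto.connect_s3()
bucket = conn.get_bucket('mybucket')  # bucket name taken from the question

key = Key(bucket)
key.key = 'index.html'
# The headers dict is sent with the PUT request, so the object is stored
# as text/html instead of the application/octet-stream default; policy
# applies the canned ACL in the same call.
key.set_contents_from_string(
    '<html>blah</html>',
    headers={'Content-Type': 'text/html'},
    policy='public-read',
)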
Answer 1 (score: 2)
For those who need a one-liner:
import boto3
s3 = boto3.resource('s3')
s3.Bucket('bucketName').put_object(Key='keyName', Body='content or fileData', ContentType='contentType', ACL='check below')
Supported ACL values:
'private'|'public-read'|'public-read-write'|'authenticated-read'|'aws-exec-read'|'bucket-owner-read'|'bucket-owner-full-control'
The full list of parameters supported by put_object can be found here: https://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.put_object
Answer 2 (score: 0)
I wasn't able to get the above solution to actually persist my metadata changes.
Perhaps because I was uploading from a file and boto was resetting the content type from the guessed MIME type? Also, I am uploading .m3u8 and .ts files for HLS encoding, so that could interfere as well.
Anyway, here's what worked for me.
import boto
from boto.s3.key import Key

conn = boto.connect_s3()
bucket = conn.get_bucket('mybucket')
key_m3u8 = Key(bucket)
key_m3u8.key = s3folder + "/" + s3keyname
# Send Content-Type and Cache-Control headers along with the upload.
key_m3u8.metadata = {"Content-Type": "application/x-mpegURL", "Cache-Control": "public,max-age=8"}
key_m3u8.set_contents_from_filename("path_to_my_file", policy="public-read")
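If the MIME-type guess really is what overwrote the header, another option is to work out the content type per file extension yourself and always set it explicitly; a rough sketch, with the HLS mappings hard-coded as an assumption:
import mimetypes

# Extensions used for HLS that the standard mimetypes table may not know about.
HLS_TYPES = {'.m3u8': 'application/x-mpegURL', '.ts': 'video/MP2T'}

def guess_content_type(filename):
    for ext, ctype in HLS_TYPES.items():
        if filename.lower().endswith(ext):
            return ctype
    # Fall back to the standard library's guess, then a generic default.
    return mimetypes.guess_type(filename)[0] or 'application/octet-stream'
The returned value can then be dropped into the metadata dict (or the content_type attribute) in place of the hard-coded string above.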
Answer 3 (score: 0)
If you are using the AWS S3 Bitbucket Pipelines Python script, add a content_type parameter:
s3_upload.py
def upload_to_s3(bucket, artefact, bucket_key, content_type):
    ...

def main():
    ...
    parser.add_argument("content_type", help="Content Type File")
    ...
    if not upload_to_s3(args.bucket, args.artefact, args.bucket_key, args.content_type):
Then modify bitbucket-pipelines.yml as follows:
...
- python s3_upload.py bucket_name file key content_type
...
where the content_type argument can be one of the MIME types (IANA media types).
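Inside upload_to_s3 itself, the new argument presumably just has to be forwarded to the actual upload call; a hypothetical sketch (the real body of the pipeline script may differ), using the boto3 client:
import boto3

def upload_to_s3(bucket, artefact, bucket_key, content_type):
    # Hypothetical body: forward the content type so the uploaded object
    # is not stored as application/octet-stream.
    client = boto3.client('s3')
    try:
        client.upload_file(artefact, bucket, bucket_key,
                           ExtraArgs={'ContentType': content_type})
    except Exception as exc:
        print("Upload failed: {}".format(exc))
        return False
    return True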