S3 PutObject (via Node) in AWS Lambda doubles the file size when saving to a bucket

Asked: 2016-12-27 19:40:19

Tags: amazon-web-services amazon-s3 aws-lambda

I have been using http.get and s3.putObject. Basically, I just want to fetch a file from an HTTP location and save it to a bucket in S3. It seems simple enough. The original file size is 47kb.

The problem is that the retrieved file (47kb) ends up in the S3 bucket (via s3.putObject) at 92.4kb. Somewhere along the way the file doubles in size, which makes it unusable.

How do I prevent the file from doubling in size when it is saved to the S3 bucket?

Here is the complete code used:

var http = require('http');
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

exports.handler = function(event, context) {
    var imgSourceURL = "http://www.asite.com/an-image.jpg";
    var body;
    var stagingparams;
    http.get(imgSourceURL, function(res) {
        res.on('data', function(chunk) { body += chunk; });
        res.on('end', function() {
            var tmp_contentType = res.headers['content-type']; // Reported as image/jpeg
            var tmp_contentLength = res.headers['content-length']; // The reported filesize is 50kb (the actual filesize on disk is 47kb)
            stagingparams = {
                Bucket: "myspecialbucket",
                Key: "mytestimage.jpg",
                Body: body
            };
            // When putObject saves the file to S3, it doubles the size of the file to 92.4kb, thus making file non-readable.
            s3.putObject(stagingparams, function(err, data) {
                if (err) {
                    console.error(err, err.stack);
                }
                else {
                    console.log(data);
                }
            });
        });
    });
};

1 Answer:

Answer 0 (score: 1)

Use an array to collect the readable stream's chunks, then concatenate all of the Buffer instances in the array before calling s3.putObject. In the original code, var body; starts out undefined, so body += chunk coerces every binary chunk to a UTF-8 string; the JPEG's non-UTF-8 byte sequences are replaced during that conversion, which roughly doubles the payload and corrupts the image. Keeping the chunks as Buffers and joining them with Buffer.concat preserves the raw bytes:

var http = require('http');
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

exports.handler = function(event, context) {
    var imgSourceURL = "http://www.asite.com/an-image.jpg";
    var body = [];
    var stagingparams;
    http.get(imgSourceURL, function(res) {
        res.on('data', function(chunk) { body.push(chunk); });
        res.on('end', function() {
            var tmp_contentType = res.headers['content-type']; // Reported as image/jpeg
            var tmp_contentLength = res.headers['content-length']; // The reported filesize is 50kb (the actual filesize on disk is 47kb)
            stagingparams = {
                Bucket: "myspecialbucket",
                Key: "mytestimage.jpg",
                Body: Buffer.concat(body)
            };
            // With Body as a single Buffer of the raw bytes, putObject stores the object at its original size.
            s3.putObject(stagingparams, function(err, data) {
                if (err) {
                    console.error(err, err.stack);
                }
                else {
                    console.log(data);
                }
            });
        });
    });
};
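
For reference, here is a minimal standalone sketch (not part of the Lambda above; the byte values are illustrative) of why the original approach inflated the file: converting a Buffer of binary data to a string replaces every invalid UTF-8 byte with a 3-byte replacement character, while Buffer.concat keeps the bytes intact.

// Hypothetical standalone demo, not part of the handler above.
var raw = Buffer.from([0xff, 0xd8, 0xff, 0xe0]); // first bytes of a typical JPEG

var asString = '' + raw; // implicit UTF-8 conversion, like body += chunk with body undefined
console.log(raw.length);                  // 4 bytes
console.log(Buffer.byteLength(asString)); // 12 bytes - each invalid byte became a 3-byte replacement character

var asBuffer = Buffer.concat([raw]); // what the answer does instead
console.log(asBuffer.length);        // 4 bytes - raw data preserved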