Node.js AWS S3 [RangeError: Maximum call stack size exceeded]

Posted: 2014-10-08 11:05:09

Tags: node.js amazon-web-services amazon-s3

I am using node.js v0.10.32 on Ubuntu 14.04.1 and trying to upload (and download) files to and from S3 with the aws-sdk (2.0.18). However, uploading a large file, e.g. 32 MB, fails with the following error:

(node) warning: Recursive process.nextTick detected. This will break in the next version of node. Please use setImmediate for recursive deferral.
...
(node) warning: Recursive process.nextTick detected. This will break in the next version of node. Please use setImmediate for recursive deferral.
RangeError: Maximum call stack size exceeded

I tried to increase the stack size with node --stack-size=16384 ..., but no luck. Here is my uploader's source code:

if (process.argv.length < 7) {
    console.log("usage: " + process.argv[0] + " " + process.argv[1] + " <config> <region> <bucket> <key> <file>")
    return -1
}

var config = process.argv[2]
var region = process.argv[3]
var bucketName = process.argv[4]
var key = process.argv[5]
var file = process.argv[6]

var fs = require('fs')
var aws = require('aws-sdk')

// Load the AWS credentials from the JSON config file, then upload.
fs.readFile(config, "utf8", function (err, configFile) {
    if (err) {
        console.log("Config file cannot be read: ", err)
        return -1
    }
    aws.config = JSON.parse(configFile)
    aws.config.region = region

    var bucket = new aws.S3({params: {Bucket: bucketName}})

    // Read the whole file into memory and upload it with a single putObject call.
    fs.readFile(file, function (err, fileData) {
        if (err) {
            console.log("Cannot open file for uploading: ", err);
        } else {
            bucket.createBucket(function () {
                var data = {Key: key, Body: fileData}
                bucket.putObject(data, function (err, data) {
                    if (err) {
                        console.log("Error uploading data: ", err);
                    } else {
                        console.log("Successfully uploaded!");
                    }
                })
            })
        }
    })
})

I am out of ideas now, please help. Could AWS multipart upload perhaps be the solution for uploading large files?

1 Answer:

Answer 0 (score: 0):

Although I could not figure out exactly why the maximum call stack size is exceeded, I did find the cause.

The problem is that S3 is not fast, and an unstable network makes things even worse. This is related to this question.

One way to work around this is to use a multipart upload with a retry mechanism (sample code can be found at 1), sketched below.
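
The following is only a minimal sketch of that idea using aws-sdk v2's low-level multipart calls (createMultipartUpload, uploadPart, completeMultipartUpload, abortMultipartUpload); it is not the linked sample code. The part size, retry count, and helper names (multipartUpload, uploadPartWithRetry) are illustrative choices.

var fs = require('fs');
var aws = require('aws-sdk');

var PART_SIZE = 5 * 1024 * 1024;   // S3 requires parts of at least 5 MB (except the last one)
var MAX_RETRIES = 3;               // how often to retry a single failed part

function multipartUpload(s3, bucketName, key, file, callback) {
    var fileData = fs.readFileSync(file);

    s3.createMultipartUpload({Bucket: bucketName, Key: key}, function (err, mp) {
        if (err) return callback(err);

        var partCount = Math.ceil(fileData.length / PART_SIZE);
        var completedParts = [];
        var partNumber = 1;

        function uploadNextPart() {
            if (partNumber > partCount) {
                // All parts are up; ask S3 to assemble the final object.
                return s3.completeMultipartUpload({
                    Bucket: bucketName,
                    Key: key,
                    UploadId: mp.UploadId,
                    MultipartUpload: {Parts: completedParts}
                }, callback);
            }
            var start = (partNumber - 1) * PART_SIZE;
            var body = fileData.slice(start, start + PART_SIZE);
            uploadPartWithRetry(body, partNumber, 0);
        }

        function uploadPartWithRetry(body, n, attempt) {
            s3.uploadPart({
                Bucket: bucketName,
                Key: key,
                UploadId: mp.UploadId,
                PartNumber: n,
                Body: body
            }, function (err, data) {
                if (err) {
                    if (attempt < MAX_RETRIES) {
                        // A slow or flaky connection only costs one part, not the whole file.
                        return setTimeout(function () {
                            uploadPartWithRetry(body, n, attempt + 1);
                        }, 1000 * (attempt + 1));
                    }
                    // Give up and abort so S3 does not keep the incomplete parts around.
                    return s3.abortMultipartUpload(
                        {Bucket: bucketName, Key: key, UploadId: mp.UploadId},
                        function () { callback(err); });
                }
                completedParts.push({ETag: data.ETag, PartNumber: n});
                partNumber++;
                uploadNextPart();
            });
        }

        uploadNextPart();
    });
}

// Example call, reusing the command-line variables from the question's uploader:
// multipartUpload(new aws.S3(), bucketName, key, file, function (err) {
//     if (err) console.log("Error uploading data: ", err);
//     else console.log("Successfully uploaded!");
// });

Newer aws-sdk releases also ship a managed helper, s3.upload(), which splits large bodies into parts and retries failed parts for you, so upgrading the SDK and swapping putObject for upload may be the simpler fix.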