I have an API method that, when called with an array of file keys, downloads the corresponding files from S3. Instead of downloading them to disk, I want to stream them, zip them, and return the archive to the client.
Here is my current code:
reports.get('/xxx/:filenames', async (req, res) => {
  var AWS = require('aws-sdk');
  var s3 = new AWS.S3();
  var str_array = req.params.filenames.split(',');
  for (var i = 0; i < str_array.length; i++) {
    var filename = str_array[i].trim();
    var localFileName = './' + filename;
    var params = {
      Bucket: config.reportBucket,
      Key: filename
    };
    s3.getObject(params, (err, data) => {
      if (err) console.error(err);
      var file = require('fs').createWriteStream(localFileName);
      s3.getObject(params).createReadStream().pipe(file);
      console.log(file);
    });
  }
});
How do I stream the files instead of downloading them to disk, and how do I zip them so they can be returned to the client?
Answer (score: 1)
The main problem is zipping multiple files.
More specifically, it is downloading them from AWS S3 in bulk.
I searched the AWS SDK and did not find a bulk S3 operation.
That leaves us with one possible solution.
Here is a rough, untested example, but it should give you the idea:
// Always import packages at the beginning of the file.
const AWS = require('aws-sdk');
const fs = require('fs');
const zipFolder = require('zip-folder');

const s3 = new AWS.S3();

reports.get('/xxx/:filenames', async (req, res) => {
  const filesArray = req.params.filenames.split(',');
  for (const rawName of filesArray) {
    const fileName = rawName.trim();
    const localFileName = './' + fileName;
    const params = {
      Bucket: config.reportBucket,
      Key: fileName
    };
    // Probably you'll need some Promise logic here, to handle the end of the stream operation.
    const fileStream = fs.createWriteStream(localFileName);
    s3.getObject(params).createReadStream().pipe(fileStream);
  }
  // After that, all required files will be in the target folder.
  // Now you need to compress the folder and send it back to the user.
  // We wrap the callback function in a Promise, to make the code read in a "sync" way.
  await new Promise((resolve) => zipFolder('/path/to/the/folder', '/path/to/archive.zip', (err) => resolve()));
  // And now you can send the zipped folder to the user (also using streams).
  fs.createReadStream('/path/to/archive.zip').pipe(res);
});
Note: given the nature of streams, you may run into issues with asynchronous behavior, so first make sure that all files are fully written to the folder before zipping it.
Just to mention, I have not tested this code. So if anything goes wrong, let's debug it together.