I generated a large CSV file (100 MB+) and uploaded it to AWS S3. The file is compressed. Now I need to download the file to the user, and I am running into out-of-memory problems.
See the code below:
header('Content-Type: text/csv; charset=utf-8');
header("Content-Disposition: attachment; filename=example.csv");
$result = $this->loadFromS3(); //This returns the file from AWS S3
echo bzdecompress($result['Body']);
How can I adjust the code to avoid exhausting memory?

Answer (score: 0):
Simply put, don't load the whole object into $result; stream the file instead. Use the Amazon S3 stream wrapper:
// $client is an Aws\S3\S3Client instance; registering the wrapper lets
// fopen() read objects through s3:// URLs.
$client->registerStreamWrapper();

// Open a stream in read-only mode
if ($stream = fopen('s3://bucket/test.bz2', 'r')) {
    // Decompress on the fly while reading
    stream_filter_append($stream, 'bzip2.decompress', STREAM_FILTER_READ);

    // While the stream is still open
    while (!feof($stream)) {
        // Read 1024 bytes from the stream and send them to the client
        echo fread($stream, 1024);
    }

    // Be sure to close the stream resource when you're done with it
    fclose($stream);
}
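For completeness, here is a minimal end-to-end sketch of the same approach, assuming the AWS SDK for PHP v3, the bz2 extension, credentials available from the environment, and a hypothetical bucket/key ("my-bucket", "example.csv.bz2") and region, all of which you would replace with your own values. It sends the download headers from the question before streaming:

// Minimal sketch, not a drop-in implementation: bucket, key, and region are assumptions.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // assumption: your bucket's region
]);
$client->registerStreamWrapper();

// Send the download headers before any output.
header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename=example.csv');

if ($stream = fopen('s3://my-bucket/example.csv.bz2', 'r')) {
    // Decompress each chunk as it is read from S3.
    stream_filter_append($stream, 'bzip2.decompress', STREAM_FILTER_READ);

    while (!feof($stream)) {
        echo fread($stream, 8192); // send one chunk at a time
        flush();                   // push it to the client instead of buffering
    }
    fclose($stream);
}

Because the data is echoed in small chunks and never held in a single string, peak memory stays roughly at the chunk size regardless of the object's size. If PHP output buffering is enabled, you may also need ob_flush() (or to disable buffering) so the chunks actually reach the client as they are read.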