I have a 40 MB zip file at http://info/data/bigfile.zip that I want to download to my local server. What is currently the best way to download a zip file of that size using PHP or header requests so that it won't time out at 8M or give me a 500 error? Right now, I keep timing out.
Answer 0 (score: 1)
To download big files via PHP, try something like this (source: http://teddy.fr/2007/11/28/how-serve-big-files-through-php/):
<?php
define('CHUNK_SIZE', 1024*1024); // Size (in bytes) of each chunk

// Read a file and display its content chunk by chunk
function readfile_chunked($filename, $retbytes = TRUE) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}

// Here goes your code for checking that the user is logged in
// ...
// ...
$filename = 'path/to/your/file'; // path to your file
$mimetype = 'mime/type';
header('Content-Type: ' . $mimetype);
readfile_chunked($filename);
?>
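For the zip in the question, a minimal usage sketch might look like the following; the local path, the download headers, and the set_time_limit(0) call are assumptions on top of the snippet above, not part of the original answer:

<?php
// Hypothetical usage: stream a large zip to the browser with download
// headers. Assumes readfile_chunked() from above is defined and that the
// zip already exists on local disk (the path below is made up).
$filename = '/path/to/bigfile.zip';

set_time_limit(0); // disable PHP's execution time limit for this request
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="bigfile.zip"');
header('Content-Length: ' . filesize($filename));

readfile_chunked($filename);
?>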
**Second solution**
Copy the remote file over HTTP one small chunk at a time:
/**
 * Copy remote file over HTTP one small chunk at a time.
 *
 * @param $infile  The full URL to the remote file
 * @param $outfile The path where to save the file
 */
function copyfile_chunked($infile, $outfile) {
    $chunksize = 10 * (1024 * 1024); // 10 Megs

    // parse_url breaks a URL apart into its parts, i.e. host, path,
    // query string, etc.
    $parts = parse_url($infile);
    $i_handle = fsockopen($parts['host'], 80, $errno, $errstr, 5);
    $o_handle = fopen($outfile, 'wb');

    if ($i_handle === false || $o_handle === false) {
        return false;
    }

    if (!empty($parts['query'])) {
        $parts['path'] .= '?' . $parts['query'];
    }

    // Send the request to the server for the file.
    $request  = "GET {$parts['path']} HTTP/1.1\r\n";
    $request .= "Host: {$parts['host']}\r\n";
    $request .= "User-Agent: Mozilla/5.0\r\n";
    $request .= "Keep-Alive: 115\r\n";
    $request .= "Connection: keep-alive\r\n\r\n";
    fwrite($i_handle, $request);

    // Now read the headers from the remote server. We'll need
    // to get the content length.
    $headers = array();
    while (!feof($i_handle)) {
        $line = fgets($i_handle);
        if ($line == "\r\n") break;
        $headers[] = $line;
    }

    // Look for the Content-Length header, and get the size
    // of the remote file.
    $length = 0;
    foreach ($headers as $header) {
        if (stripos($header, 'Content-Length:') === 0) {
            $length = (int) trim(substr($header, strlen('Content-Length:')));
            break;
        }
    }

    // Start reading in the remote file, and writing it to the
    // local file one chunk at a time.
    $cnt = 0;
    while (!feof($i_handle)) {
        $buf = fread($i_handle, $chunksize);
        $bytes = fwrite($o_handle, $buf);
        if ($bytes === false) {
            return false;
        }
        $cnt += $bytes;

        // We're done reading when we've reached the content length.
        if ($cnt >= $length) break;
    }

    fclose($i_handle);
    fclose($o_handle);
    return $cnt;
}
Adjust the $chunksize variable to your needs. This has only been mildly tested, and it could easily break for any number of reasons.
Usage:
copyfile_chunked('http://somesite.com/somefile.jpg', '/local/path/somefile.jpg');
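If allow_url_fopen is enabled, a shorter sketch of the same chunked-copy idea is to let PHP's HTTP stream wrapper and stream_copy_to_stream() do the work; this is an assumed alternative, not part of the original answer:

<?php
// Alternative sketch (assumes allow_url_fopen = On): copy a remote file
// to a local path without loading it into memory all at once.
// stream_copy_to_stream() copies the stream in internal chunks.
set_time_limit(0); // the copy may take longer than max_execution_time

$in  = fopen('http://somesite.com/somefile.jpg', 'rb');
$out = fopen('/local/path/somefile.jpg', 'wb');

if ($in === false || $out === false) {
    die('Could not open the source URL or the destination file');
}

$copied = stream_copy_to_stream($in, $out);

fclose($in);
fclose($out);

echo "Copied $copied bytes\n";
?>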
Answer 1 (score: 0)
You haven't given many details, but it sounds like the default php.ini settings are limiting your server's ability to transfer large files through the PHP web interface in a timely manner, namely settings such as post_max_size, upload_max_filesize, or max_execution_time.

You could jump into your php.ini file, bump up those values, restart Apache, and retry the file transfer.
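For a download script, max_execution_time (and a low memory_limit, such as the old 8M default) is usually the limit that matters. Here is a sketch, under that assumption, for inspecting the settings and raising the ones that can be changed at runtime; post_max_size and upload_max_filesize can only be changed in php.ini or .htaccess, not with ini_set():

<?php
// Sketch: report the current limits, then relax the runtime-changeable ones.
echo 'post_max_size: '       . ini_get('post_max_size')       . "\n";
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . "\n";
echo 'max_execution_time: '  . ini_get('max_execution_time')  . "\n";
echo 'memory_limit: '        . ini_get('memory_limit')        . "\n";

set_time_limit(0);              // no execution time limit for this script
ini_set('memory_limit', '64M'); // assumption: 64M is enough headroom here
?>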
HTH