I have tested this cURL code and it does download multiple pages in parallel. What I'd like to know is: what is the maximum number of simultaneous downloads allowed?
<?php

class Footo_Content_Retrieve_HTTP_CURLParallel
{
    /**
     * Fetch a collection of URLs in parallel using cURL. The results are
     * returned as an associative array, with the URLs as the keys and the
     * content of the URLs as the values.
     *
     * @param array<string> $addresses An array of URLs to fetch.
     * @return array<string> The content of each URL we were asked to fetch.
     **/
    public function retrieve($addresses)
    {
        $multiHandle = curl_multi_init();
        $handles = array();
        $results = array();

        // create one easy handle per URL and attach it to the multi handle
        foreach ($addresses as $url)
        {
            $handle = curl_init($url);
            $handles[$url] = $handle;
            curl_setopt_array($handle, array(
                CURLOPT_HEADER         => false,
                CURLOPT_RETURNTRANSFER => true,
            ));
            curl_multi_add_handle($multiHandle, $handle);
        }

        // set up and make any requests..
        $result = CURLM_CALL_MULTI_PERFORM;
        $running = false;
        while ($result == CURLM_CALL_MULTI_PERFORM)
        {
            $result = curl_multi_exec($multiHandle, $running);
        }

        // wait until data arrives on all sockets
        while ($running && ($result == CURLM_OK))
        {
            if (curl_multi_select($multiHandle) === -1)
            {
                // select may fail or return immediately on some systems;
                // back off briefly so this loop doesn't spin at 100% CPU
                usleep(100);
            }

            // process whatever is ready on the sockets
            $result = CURLM_CALL_MULTI_PERFORM;
            while ($result == CURLM_CALL_MULTI_PERFORM)
            {
                $result = curl_multi_exec($multiHandle, $running);
            }
        }

        // collect the results, then clean up
        foreach ($handles as $url => $handle)
        {
            $results[$url] = curl_multi_getcontent($handle);
            curl_multi_remove_handle($multiHandle, $handle);
            curl_close($handle);
        }
        curl_multi_close($multiHandle);

        return $results;
    }
}
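
For context, here is a minimal usage sketch of the class above. The URLs are placeholders, not part of the original post:

// Usage sketch: the URLs below are arbitrary examples.
$retriever = new Footo_Content_Retrieve_HTTP_CURLParallel();
$pages = $retriever->retrieve(array(
    'https://example.com/a',
    'https://example.com/b',
));
foreach ($pages as $url => $body)
{
    printf("%s: %d bytes\n", $url, strlen((string) $body));
}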
Answer (score: 0):
There is no limit, but you have to take into account your server's internet connection, bandwidth, memory leaks, CPU, and so on.
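
In other words, curl_multi itself imposes no hard cap. If you want to enforce your own limit rather than relying on available resources, newer versions of cURL let you cap simultaneous transfers on the multi handle. A sketch, assuming PHP 7.0.7+ with libcurl 7.30+; the limit of 10 is an arbitrary example:

// Sketch: cap simultaneous transfers on a multi handle.
// Assumes PHP 7.0.7+ / libcurl 7.30+. The value 10 is arbitrary.
$multiHandle = curl_multi_init();
curl_multi_setopt($multiHandle, CURLMOPT_MAX_TOTAL_CONNECTIONS, 10);
// Easy handles added beyond the cap are queued inside libcurl and
// started as earlier transfers finish. CURLMOPT_MAX_HOST_CONNECTIONS
// is also available for a per-host limit.

With this option set, you can add all your handles up front and let libcurl schedule them, instead of batching the URL list yourself.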