This is my code. It runs an infinite loop that sleeps for 60 seconds and then downloads content from other websites again. It works fine when I run it locally with WAMP, but after I upload it to the server it stops working after a while. I can't figure out what the reason is.
<?php
include('scrapconnection.php');
include_once('simple_html_dom.php');

for (;;) {
    sleep(60);
    $keyword = "laptop";

    // Fetch the list of websites to scrape, along with their selectors.
    $result = mysql_query("select * from website");
    while ($row = mysql_fetch_array($result)) {
        $searchLink         = $row['searchLink'];
        $rootElement        = $row['rootElement'];
        $productTitle       = $row['productTitle'];
        $productLink        = $row['productLink'];
        $productPrice       = $row['productPrice'];
        $productImage       = $row['productImage'];
        $productDescription = $row['productDescription'];

        // Build the search URL for the current keyword and load the page.
        $url  = str_replace("__search_keyword__", $keyword, $searchLink);
        $html = file_get_html($url);

        // Patterns to strip leading/trailing whitespace and collapse inner runs.
        $pat[0] = '/^\s+/';
        $pat[1] = '/\s{2,}/';
        $pat[2] = '/\s+$/';
        $rep[0] = '';
        $rep[1] = ' ';
        $rep[2] = '';

        foreach ($html->find($rootElement) as $heading) {
            $item['productTitle']       = preg_replace($pat, $rep, $heading->find($productTitle, 0)->plaintext);
            $item['productLink']        = preg_replace($pat, $rep, $heading->find($productLink, 0)->href);
            $item['productImage']       = preg_replace($pat, $rep, $heading->find($productImage, 0)->src);
            $item['productPrice']       = preg_replace($pat, $rep, $heading->find($productPrice, 0)->innertext);
            $item['productDescription'] = preg_replace($pat, $rep, $heading->find($productDescription, 0)->plaintext);

            // Extract the domain name from the product link.
            preg_match('@^(?:http://www\.)?([^/]+)@i', $item['productLink'], $matches);
            $item['domainName'] = $matches[1];
            $articles[] = $item;
        }
    }

    // Replace the previous JSON file with the fresh results.
    unlink('http://xab.com/files/'.$keyword.'.json');
    $deal = json_encode($articles);
    file_put_contents('http://xab.com/files/'.$keyword.'.json', $deal);
    unset($articles);
}
?>
Answer 0 (score: 1)
This happens because of the web server's execution timeout. I don't believe any shared host will allow a script like this to keep running, even with cron. Try getting a dedicated server that allows this kind of long-running execution.
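For illustration only (not part of the original answer): if the host at least allows CLI cron jobs, one common workaround is to drop the for(;;)/sleep(60) loop and let cron start a fresh process every minute, so no single PHP process has to outlive the server's limits. A minimal sketch, where scrape.php, the cron path, and runScrapePass() are hypothetical names:

<?php
// scrape.php — hypothetical single-pass version of the scraper.
// Instead of looping forever inside PHP, let cron launch it, e.g.:
//   * * * * * /usr/bin/php /path/to/scrape.php
// Each invocation does exactly one pass over the `website` table, then exits.

set_time_limit(0);        // lift PHP's time limit for this pass (already 0 in CLI)
ignore_user_abort(true);  // keep running even if the connection is dropped

include('scrapconnection.php');
include_once('simple_html_dom.php');

runScrapePass("laptop");  // one pass, then the process ends

function runScrapePass($keyword) {
    // ... same scraping and JSON-writing logic as in the question,
    //     minus the for(;;) and the sleep(60) ...
}
?>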
Answer 1 (score: 0)
Try adding this at the top of your script:
ini_set("memory_limit", "-1");
Check out:
What happens when the server is in an infinite loop and the client stops?
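A slightly fuller version of that suggestion (my addition, not part of the original answer): besides lifting the memory limit, you may also need to lift PHP's execution time limit, assuming the host allows these settings to be overridden at runtime:

<?php
// At the very top of the script, before the for(;;) loop.
ini_set("memory_limit", "-1");  // no PHP memory limit (as suggested above)
set_time_limit(0);              // no max_execution_time for this script
ignore_user_abort(true);        // don't stop when the browser disconnects

Note that many shared hosts disable these overrides, which ties back to answer 0.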