My script is running into memory problems.

I need the script to loop through every customer in the database, fetch all of their product data, and generate a text file. Each customer can have anywhere from 1 to 100,000 products.

I fetch the product data in batches of 1,000 and write them to the file to try to stop the script from timing out. That has improved things a lot, but I still have problems with customers who have a large number of products. It seems to be an issue for customers with more than 5,000 products.

It seems to stop writing to the file after the 5th batch (5,000 products), but the browser just hangs as if it is still generating the file, yet the number of products in the file never increases.

Can anyone help?
set_time_limit(0);

$db = new msSqlConnect('db');

$select = "SELECT customer FROM feeds";
$run = mssql_query($select);

while ($row = mssql_fetch_array($run)) {
    $arg = $row['customer'];
    $txt_file = 'shopkeeper/' . $arg . '.txt';
    $generated = generateFeed($db, $arg, $txt_file);
    if ($generated) {
        $update = "UPDATE feeds SET lastGenerated = '$generated' WHERE customer = '$arg'";
        mssql_query($update);
    }
}
function generateFeed($db, $customer, $file)
{
    // if the file already exists, delete it so a new one can be written
    if (file_exists($file)) {
        unlink($file);
    }

    $datafeed_separator = "|";

    // get product details
    $productsObj = new Products($db, $customer);

    // find out how many products the customer has
    $countProds = $productsObj->countProducts();
    $productBatchLimit = 1000;

    // create new file
    $fh = fopen($file, 'a');

    $counter = 1;
    for ($i = 0; $i < $countProds; $i += $productBatchLimit) {
        $txt = '';
        $limit = $productBatchLimit * $counter;
        $products = $productsObj->getProducts($i, $limit);
        foreach ($products as $product) {
            // $prod_name, $prod_brand, $prod_desc and $prod_price are
            // assumed to be populated from $product (the assignments were
            // omitted in the question)
            $txt .=
                $prod_name . $datafeed_separator .
                $prod_brand . $datafeed_separator .
                $prod_desc . $datafeed_separator .
                $prod_price . $datafeed_separator . "\n";
        }
        fwrite($fh, $txt);
        flush();
        $counter++;
    }
    fclose($fh);

    $endTime = date('Y-m-d H:i:s');
    return $endTime;
}
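For reference, the offset-based batching loop above can be reduced to this minimal, self-contained sketch. Plain arrays stand in for the database rows, and getBatch() is a made-up stand-in for Products::getProducts($offset, $limit):

```php
<?php
// Self-contained sketch of fixed-size batch iteration over a product list.
// getBatch() is an illustrative stand-in for Products::getProducts().
function getBatch(array $all, int $offset, int $limit): array
{
    return array_slice($all, $offset, $limit);
}

$allProducts = range(1, 2500);   // pretend 2,500 product rows
$productBatchLimit = 1000;
$countProds = count($allProducts);

$fetched = 0;
for ($i = 0; $i < $countProds; $i += $productBatchLimit) {
    $batch = getBatch($allProducts, $i, $productBatchLimit);
    $fetched += count($batch);   // a real feed would write the batch here
}

echo $fetched;                   // every row was visited exactly once
```

Each iteration advances the offset by the batch size and requests at most $productBatchLimit rows, so memory per iteration stays bounded regardless of the total row count.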
Answer (score: 1)
One thing I can see that might help your memory usage: if you move the fwrite() inside the foreach loop, you can also release $txt inside the loop. It would look something like this:
foreach ($products as $product) {
    $txt =
        $prod_name . $datafeed_separator .
        $prod_brand . $datafeed_separator .
        $prod_desc . $datafeed_separator .
        $prod_price . $datafeed_separator . "\n";
    fwrite($fh, $txt);
}
This will stop $txt from growing when a customer has many products.
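To see why per-row writes keep memory flat, here is a minimal, runnable sketch of the pattern. The sample array stands in for rows returned by Products::getProducts(); field names like 'name' and 'brand' are assumptions, since the question omits how $prod_name etc. are populated:

```php
<?php
// Per-row write pattern: build one line, write it, let $txt be reused.
// The array below is sample data standing in for database rows.
$datafeed_separator = "|";
$products = [
    ['name' => 'Widget', 'brand' => 'Acme', 'desc' => 'A widget', 'price' => '9.99'],
    ['name' => 'Gadget', 'brand' => 'Acme', 'desc' => 'A gadget', 'price' => '19.99'],
];

$file = tempnam(sys_get_temp_dir(), 'feed');
$fh = fopen($file, 'a');

foreach ($products as $product) {
    // $txt only ever holds a single product's line, so memory use does
    // not grow with the number of products
    $txt = $product['name'] . $datafeed_separator .
           $product['brand'] . $datafeed_separator .
           $product['desc'] . $datafeed_separator .
           $product['price'] . $datafeed_separator . "\n";
    fwrite($fh, $txt);
}
fclose($fh);
```

The trade-off is one fwrite() call per row instead of one per batch, but PHP's stream layer buffers small writes, so the extra calls are cheap compared with holding a multi-megabyte string in memory.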