I have the following while loop, which runs once each day:
$QUERY_AVG_RENTED_SQL = mysql_query("SELECT * FROM `xeon_users_rented` WHERE `clicks` > 0;") or _OP_CRON_00_00_ERROR(mysql_error(), __FILE__, __LINE__);
while($r = mysql_fetch_assoc($QUERY_AVG_RENTED_SQL)){
    mysql_query("UPDATE `xeon_users_rented` SET `avg` = '"._OP_AVG($r['since'], $r['clicks'])."' WHERE `xeon_users_rented`.`id` = {$r['id']} LIMIT 1;") or _OP_CRON_00_00_ERROR(mysql_error(), __FILE__, __LINE__);
}
This is the function it runs for each user:
function _OP_AVG($time, $clicks){
    // Midnight today, and midnight of the day the record started.
    $avg['time_1'] = mktime(0, 0, 0, date('m', time()), date('d', time()), date('Y', time()));
    $avg['time_2'] = mktime(0, 0, 0, date('m', $time), date('d', $time), date('Y', $time));
    // Whole days elapsed between the two.
    $avg['time_3'] = $avg['time_1'] - $avg['time_2'];
    $avg['days'] = floor( $avg['time_3'] / 86400 );
    if($avg['days'] == 0 || $clicks == 0){
        return number_format( 0 , 3 );
    } else {
        return number_format( ($clicks / $avg['days']), 3 );
    }
}
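For reference, the function above truncates both timestamps to midnight and divides the click count by the number of whole days elapsed. Here is a minimal, self-contained sketch of that same logic (with a hypothetical name, `avg_clicks_per_day`, and fixed timestamps instead of `time()` so the result is deterministic):

```php
<?php
// Pin the timezone so the midnight truncation is deterministic.
date_default_timezone_set('UTC');

// Sketch of the daily-average logic in _OP_AVG above.
function avg_clicks_per_day(int $since, int $now, int $clicks): string {
    // Truncate both timestamps to midnight, as _OP_AVG does with mktime().
    $midnightNow   = strtotime('midnight', $now);
    $midnightSince = strtotime('midnight', $since);
    $days = (int) floor(($midnightNow - $midnightSince) / 86400);
    if ($days === 0 || $clicks === 0) {
        return number_format(0, 3);
    }
    return number_format($clicks / $days, 3);
}

// 10 whole days between the two dates, 25 clicks:
echo avg_clicks_per_day(strtotime('2015-01-01'), strtotime('2015-01-11'), 25), "\n"; // prints 2.500
```

Note that because both ends are truncated to midnight, any record created earlier the same day yields `days == 0` and therefore an average of `0.000`.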
In my xeon_users_rented table I have around 140k records. I have set the memory_limit in php.ini to 400M, but I need to increase this limit every day, or my cron job won't finish the file because it hits the memory limit.
Is there another way I can write the loop above without creating this memory bottleneck?
Answer 0 (score: 0)
$UPDATE = array();
$QUERY_AVG_RENTED_SQL = mysql_unbuffered_query("SELECT * FROM `xeon_users_rented` WHERE `clicks` > 0;") or _OP_CRON_00_00_ERROR(mysql_error(), __FILE__, __LINE__);
while($r = mysql_fetch_assoc($QUERY_AVG_RENTED_SQL)){
    // We can't send another mysql_query while the unbuffered query hasn't
    // finished, so queue the UPDATE statements and run them afterwards.
    $UPDATE[] = "UPDATE `xeon_users_rented` SET `avg` = '"._OP_AVG($r['since'], $r['clicks'])."' WHERE `xeon_users_rented`.`id` = {$r['id']} LIMIT 1;";
}
foreach ($UPDATE as $query) {
    mysql_query($query) or _OP_CRON_00_00_ERROR(mysql_error(), __FILE__, __LINE__);
}
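Note that this still accumulates ~140k query strings in the `$UPDATE` array before executing them. If the goal is to eliminate PHP memory use entirely, the whole calculation can be pushed into MySQL as a single set-based statement. An untested sketch, assuming `since` holds a Unix timestamp and `avg` is a numeric column:

```sql
-- Hypothetical set-based alternative: compute clicks-per-whole-day directly
-- in MySQL, replacing the PHP loop with one UPDATE. Rows where less than a
-- full day has elapsed are excluded, mirroring the days == 0 branch.
UPDATE `xeon_users_rented`
SET `avg` = ROUND(
    `clicks` / FLOOR((UNIX_TIMESTAMP(CURDATE())
                    - UNIX_TIMESTAMP(DATE(FROM_UNIXTIME(`since`)))) / 86400), 3)
WHERE `clicks` > 0
  AND UNIX_TIMESTAMP(CURDATE())
    - UNIX_TIMESTAMP(DATE(FROM_UNIXTIME(`since`))) >= 86400;
```

This keeps all 140k rows inside the database server, so PHP's `memory_limit` is no longer a factor.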