How do I limit PHP memory usage when processing MySQL query results?

Asked: 2009-10-08 03:50:25

Tags: php mysql memory profiling

So I have a PHP page that allows users to download a CSV, which can be a whole lot of records. The problem is that the more results the MySQL query returns, the more memory it uses. That's not surprising, but it does pose a problem.

I tried using mysql_unbuffered_query(), but that didn't make any difference, so I need some other way to free the memory that I assume is being used by previously processed rows. Is there a standard way to do this?

Here's an annotated log that illustrates what I'm talking about:

// Method first called
2009-10-07 17:44:33 -04:00 --- info: used 3555064 bytes of memory

// Right before the query is executed
2009-10-07 17:44:33 -04:00 --- info: used 3556224 bytes of memory

// Immediately after query execution
2009-10-07 17:44:34 -04:00 --- info: used 3557336 bytes of memory

// Now we're processing the result set
2009-10-07 17:44:34 -04:00 --- info: Downloaded 1000 rows and used 3695664 bytes of memory
2009-10-07 17:44:35 -04:00 --- info: Downloaded 2000 rows and used 3870696 bytes of memory
2009-10-07 17:44:36 -04:00 --- info: Downloaded 3000 rows and used 4055784 bytes of memory
2009-10-07 17:44:37 -04:00 --- info: Downloaded 4000 rows and used 4251232 bytes of memory
2009-10-07 17:44:38 -04:00 --- info: Downloaded 5000 rows and used 4436544 bytes of memory
2009-10-07 17:44:39 -04:00 --- info: Downloaded 6000 rows and used 4621776 bytes of memory
2009-10-07 17:44:39 -04:00 --- info: Downloaded 7000 rows and used 4817192 bytes of memory
2009-10-07 17:44:40 -04:00 --- info: Downloaded 8000 rows and used 5012568 bytes of memory
2009-10-07 17:44:41 -04:00 --- info: Downloaded 9000 rows and used 5197872 bytes of memory
2009-10-07 17:44:42 -04:00 --- info: Downloaded 10000 rows and used 5393344 bytes of memory
2009-10-07 17:44:43 -04:00 --- info: Downloaded 11000 rows and used 5588736 bytes of memory
2009-10-07 17:44:43 -04:00 --- info: Downloaded 12000 rows and used 5753560 bytes of memory
2009-10-07 17:44:44 -04:00 --- info: Downloaded 13000 rows and used 5918304 bytes of memory
2009-10-07 17:44:45 -04:00 --- info: Downloaded 14000 rows and used 6103488 bytes of memory
2009-10-07 17:44:46 -04:00 --- info: Downloaded 15000 rows and used 6268256 bytes of memory
2009-10-07 17:44:46 -04:00 --- info: Downloaded 16000 rows and used 6443152 bytes of memory
2009-10-07 17:44:47 -04:00 --- info: used 6597552 bytes of memory

// This is after unsetting the variable. Didn't make a difference because garbage
// collection had not run
2009-10-07 17:44:47 -04:00 --- info: used 6598152 bytes of memory

I'm hoping there's some standard technique for dealing with large result sets like this (or even much larger ones), but my research hasn't turned up anything.

Ideas?

Here's some code, as requested:

    $results = mysql_query($query);

    Kohana::log('info', "used " . memory_get_usage() . " bytes of memory");                

    $first = TRUE;
    $row_count = 0;

    while ($row = mysql_fetch_assoc($results)) {
        $row_count++;
        $new_row = $row;

        if (array_key_exists('user_id', $new_row)) {
            unset($new_row['user_id']);
        }

        if ($first) {
            $columns = array_keys($new_row);
            $columns = array_map(array('columns', "title"), $columns);
            echo implode(",", array_map(array('Reports_Controller', "_quotify"), $columns));
            echo "\n";
            $first = FALSE;
        }

        if (($row_count % 1000) == 0) {
            Kohana::log('info', "Downloaded $row_count rows and used " . memory_get_usage() . " bytes of memory");                
        }

        echo implode(",", array_map(array('Reports_Controller', "_quotify"), $new_row));
        echo "\n";
    }
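The `Reports_Controller::_quotify` helper referenced above isn't shown in the post. For context, a minimal CSV-quoting implementation might look like the following sketch; the function body is my assumption, not the original code:

```php
<?php
// Hypothetical sketch of a CSV-quoting helper like the post's _quotify():
// wrap the value in double quotes and double any embedded quotes (RFC 4180 style).
function quotify($value)
{
    return '"' . str_replace('"', '""', $value) . '"';
}

echo quotify('plain'), "\n";            // "plain"
echo quotify('say "hi", world'), "\n";  // "say ""hi"", world"
```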

4 Answers:

Answer 0 (score: 2)

Further profiling shows that the problem is a memory leak somewhere. I stripped the code down to its simplest form, and memory usage does not grow with each iteration. I suspect it's Kohana (the framework I'm using).
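The stripped-down check described above can be reproduced without any framework at all. A rough sketch (the batch size and row shape are made up): run the loop bare and watch `memory_get_usage()` before and after.

```php
<?php
// Sketch: a bare version of the fetch loop with no framework calls.
// If nothing retains a reference to each row, usage should stay roughly flat.
$before = memory_get_usage();

for ($i = 1; $i <= 50000; $i++) {
    $row = array('id' => $i, 'name' => "user$i"); // stand-in for a fetched row
    unset($row);                                  // release it immediately
}

$after = memory_get_usage();
echo "grew by ", $after - $before, " bytes\n";
```

If this flat loop stays flat but the real loop grows, the leak is in whatever the real loop calls per iteration (here, the framework's logger is the obvious suspect).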

Answer 1 (score: 0)

Is this a "live" download? By that I mean, are you pushing the CSV to the client as you generate it? If so, there are a few things you can do:

  1. Don't use output buffering. It keeps everything in memory until you flush it explicitly or implicitly (when the script ends), which uses more memory.
  2. As you read rows from the database, write them out to the client.
  3. Beyond that, we may need to see some skeleton code.
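Point 2 above, writing each row out as it is read, can be sketched with `fputcsv()`. In a real download the target would be `php://output`; here a temp stream stands in so the snippet runs standalone, and the row data is made up:

```php
<?php
// Sketch: stream rows to a handle one at a time instead of building the
// whole CSV string in memory. Swap php://temp for php://output in a real page.
function rows_to_csv(array $rows)
{
    $out = fopen('php://temp', 'r+');
    $first = true;

    foreach ($rows as $row) {
        if ($first) {
            fputcsv($out, array_keys($row), ',', '"', '\\'); // header row
            $first = false;
        }
        // When writing to php://output, each row leaves PHP right away.
        fputcsv($out, $row, ',', '"', '\\');
    }

    rewind($out);
    $csv = stream_get_contents($out);
    fclose($out);
    return $csv;
}

echo rows_to_csv(array(
    array('id' => 1, 'name' => 'alice'),
    array('id' => 2, 'name' => 'bob'),
));
```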

Answer 2 (score: 0)

Are you flushing the data periodically? PHP's normal buffering is nasty for long-running code, since there are multiple copies of the data sitting between the MySQL client, your variables, and the output system. It's been a couple of years, but I last recall using something like this in skeleton code:

ob_end_flush();
mysql_unbuffered_query();
while ($row = mysql_fetch…) {
    … do something …

    flush(); // Push to Apache
    unset($row, … all other temporary variables …);
}
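One detail worth adding to the `ob_end_flush()` step: PHP may have more than one output buffer active (the `output_buffering` ini setting plus anything the framework opened), so a single call may not be enough. A sketch of draining them all before streaming:

```php
<?php
// Sketch: close every active userland output buffer so later flush() calls
// push data straight to the web server instead of growing a buffer in memory.
ob_start(); // simulate a buffer a framework might have opened
echo "data that was sitting in the buffer\n";

while (ob_get_level() > 0) {
    ob_end_flush(); // flush contents and close the innermost buffer
}

flush(); // with no userland buffers left, output goes out immediately
```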

Answer 3 (score: 0)

Thanks, mysql_unbuffered_query() solved my problem with RAM when processing a large data set with PHP and MySQL, which was:

PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes) in /content/apps/application_price.php on line 25