Optimizing PHP and MySQL memory management

Date: 2014-01-13 04:08:49

Tags: php mysql optimization memory-management

I have a table containing roughly 100,000 records, structured as shown below:

id | name    | desc   | length | breadth | ... | remark      (up to 56 fields)
---+---------+--------+--------+---------+-----+----------
1  | FRT-100 | -desc- | 10     | 10      | ... | remarking

I am dumping all of this data to a CSV with a cron job (Gearman); my code looks like this:

<?php
set_time_limit(0);
ini_set("memory_limit", "2048M");

// Get the total record count so the export can be looped in small chunks
$query = "SELECT COUNT(*) AS cnt FROM tablename WHERE company_id = $companyid";
$result = $link->query($query);
$count = 0;
while ($row = mysqli_fetch_array($result)) {
    $count = $row["cnt"];
}
$loop = ($count > 1000) ? ceil($count / 1000) : 1;

// Write in chunks of 1000 rows at a time to avoid timeouts
for ($ii = 1; $ii <= $loop; $ii++) {
    // Offset of the current chunk: 0, 1000, 2000, ...
    $s = ($ii - 1) * 1000;

    $q = "SELECT * FROM datas WHERE group_company_id = $companyid LIMIT 1000 OFFSET $s";
    $r = $link->query($q);

    while ($row2 = mysqli_fetch_array($r)) {
        // The CSV writing is done here; it works fine for up to about
        // 10,000-12,000 records, after which memory exhaustion occurs
    }
}
?>

I strongly suspect there is something to optimize in the way MySQL handles the OFFSET. Can anyone point me to a better, more optimized approach? I'm open to any suggestion (cron, third-party libraries, etc.).
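For reference, large OFFSET values get progressively slower because MySQL still reads and discards every skipped row, and buffered chunks can pile up in PHP memory if results are not freed. A common alternative is keyset pagination: remember the highest id written so far and filter on it instead of using OFFSET. A rough sketch, assuming the table has an indexed auto-increment id column (hypothetical here) and reusing $link and $companyid from the code above:

<?php
// Keyset pagination: chunk by "id greater than the last id seen"
// instead of LIMIT/OFFSET.
$lastId = 0;
do {
    $q = "SELECT * FROM datas
          WHERE group_company_id = $companyid AND id > $lastId
          ORDER BY id
          LIMIT 1000";
    $r = $link->query($q);

    $rowsInChunk = 0;
    while ($row2 = mysqli_fetch_assoc($r)) {
        // ... write the CSV line for $row2 here ...
        $lastId = $row2["id"];
        $rowsInChunk++;
    }
    mysqli_free_result($r);   // release each chunk before fetching the next
} while ($rowsInChunk === 1000);
?>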

1 Answer:

Answer 0 (score: 2)

Try to avoid holding everything in memory at once; instead, fetch one row at a time and write each result row out as you go.

<?php
$q = "SELECT * FROM datas";
// MYSQLI_USE_RESULT streams rows from the server instead of buffering the
// entire result set in PHP memory
$r = $link->query($q, MYSQLI_USE_RESULT);
$fp = fopen("out.csv", "w+");
// Or you could just set the content-type headers and echo the output instead
while ($row2 = mysqli_fetch_row($r)) {   // fetch_row avoids duplicate numeric/assoc keys
    fwrite($fp, implode(",", $row2) . "\n");
}
mysqli_free_result($r);
fclose($fp);

This should take care of the problem, since the full result set is never built up in memory.
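One caveat on the CSV output itself: implode() does not escape fields that contain commas, quotes, or newlines (which the desc or remark columns might). A minimal variation using fputcsv, which handles the quoting, assuming the same $link connection and out.csv target as above:

<?php
$q = "SELECT * FROM datas";
$r = $link->query($q, MYSQLI_USE_RESULT);   // stream rows, don't buffer client-side
$fp = fopen("out.csv", "w");
while ($row2 = mysqli_fetch_row($r)) {
    fputcsv($fp, $row2);   // quotes and escapes fields containing commas or quotes
}
mysqli_free_result($r);
fclose($fp);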