I am using the mysqli API to query a big table, 1,000 rows at a time, but my server's memory grows very fast; free memory drops to zero, and even swap fills up. I don't know how to fix it.
The table has 4 million rows, so I have to run the query in many batches.
Here is my code:
<?php
ini_set('memory_limit','32M');
$config = require_once('config.php');
$attachmentRoot = $config['attachment_root'];
$mysqli = new mysqli($config['DB_HOST'],$config['DB_USER'],$config['DB_PASSWORD'],$config['DB_NAME']);
if($mysqli->connect_errno)
throw new Exception('DB connect failed:'.$mysqli->connect_errno.' '.$mysqli->connect_error);
$mysqli->set_charset('gbk');
echo "\nRename The Dup Files With Suffix: .es201704111728es \n";
$startTime = microtime(true);
/**
*
* Move dup file to $name + .es201704111728es
*/
$suffix = ".es201704111728es";
$fileLinesLimit = 100000;
$listSuffix = 0;
$lines = 0;
/**
* Create File List.
*/
$fileList = '/tmp/Dupfilelist.txt';
$baseListName = $fileList.$listSuffix;
//$fs = fopen($baseListName,'w');
$totalSize = 0;
$start = 0;
$step = 10000;
$sql = "SELECT id,filepath,ids,duplicatefile,filesize FROM duplicate_attachment WHERE id> $start AND duplicatefile IS NOT NULL LIMIT $step";
$result = $mysqli->query($sql);
while($result->num_rows > 0)
{
while($row = $result->fetch_row()) // fetch each row; the processing below is commented out while tracking down the memory growth
{
/*$fiepath = $row[1];
$uniqueIdsArray = array_unique(explode(',',$row[2]));if(empty($row[3]))throw new \Exception("\n".'ERROR:'.$row[0]."\n".var_export($row[3],true)."\n");
$uniqueFilesArray = array_unique(explode(',',$row[3]));
$hasFile = array_search($fiepath,$uniqueFilesArray);
if($hasFile !== false)
unset($uniqueFilesArray[$hasFile]);
$num = count($uniqueIdsArray);
$fileNum = count($uniqueFilesArray);
$ids = implode(',',$uniqueIdsArray);
if($num>1 && $fileNum>0){
//echo "\nID: $row[0] . File Need To Rename:".var_export($uniqueFilesArray,true)."\n";
$size = intval($row[4]);
if($lines >= $fileLinesLimit){
$lines = 0;
$listSuffix++;
//$fileList .= $listSuffix;
}
array_map(function($file) use ($attachmentRoot,$suffix,$fiepath,$totalSize,$size,$fileLinesLimit,&$listSuffix,&$lines,$fileList){
//$fs = fopen($fileList.$listSuffix,'a');
if($file === $fiepath)
return -1;
$source = $file;
$target = $source.$suffix;
//rename($source,$target);
//fwrite($fs,$source.','.$target."\n");
//file_put_contents($fileList.$listSuffix, $source.','.$target."\n",FILE_APPEND);
//$totalSize += intval($size);
$lines ++;
//echo memory_get_usage()."\n";
//fclose($fs);
//unset($fs);
//try to write file without amount memory cost
//$ts = fopen('/tmp/tempfile-0412','w');
},$uniqueFilesArray);
//echo "Test Just One Attachment Record.\n";
//echo "Ids:$ids\n";
//exit();
}*/
}
echo memory_get_peak_usage(true)."\n";
if(!$mysqli->ping())
{
echo "Mysql Conncect Failed.Reconnecting.\n";
$mysqli = new mysqli($config['DB_HOST'],$config['DB_USER'],$config['DB_PASSWORD'],$config['DB_NAME']);
$mysqli->set_charset('gbk');
if(!$mysqli)
throw new Exception('DB connnect faild:'.mysqli_connect_errno().mysqli_connect_error());
}
//mysqli_free_result($result);
$result->close();
unset($result);
$start += $step;
$sql = "SELECT id,filepath,ids,duplicatefile,filesize FROM duplicate_attachment WHERE id> $start AND duplicatefile IS NOT NULL LIMIT $step";
$result = $mysqli->query($sql);
}
echo "Dup File Total Size: $totalSize\n";
echo "Script cost time :".(microtime(true)-$startTime)." ms\n";sleep(1000*10);
mysqli_close($mysqli);
exit();
Answer 0 (score: 1)
I had the XDEBUG extension enabled. After disabling that extension, everything ran fine.
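(If you are not sure whether Xdebug is loaded under the PHP that runs the script, a quick check in plain PHP, nothing specific to this script:)

<?php
// Prints bool(true) if the Xdebug extension is currently loaded.
var_dump(extension_loaded('xdebug'));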
Answer 1 (score: 0)
I ran into this problem with PHP 7.3.26 on CentOS 7 and solved it by using an unbuffered query instead of a buffered one. In the code above, replace
$result = $mysqli->query($sql)
with
$result = $mysqli->query($sql, MYSQLI_USE_RESULT)
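For reference, a minimal sketch of how the chunked loop from the question could look with an unbuffered result set. The connection details are placeholders, and it assumes the same duplicate_attachment table and columns as the question. With MYSQLI_USE_RESULT, num_rows is only known after all rows have been read, so the loop stops when a batch yields no rows, and each result is freed before the next query is issued; it also advances by the last id actually read, which matches the original behaviour when ids are contiguous:

<?php
// Sketch only: placeholder credentials, same table/columns as the question.
$mysqli = new mysqli('localhost', 'user', 'password', 'dbname');
if ($mysqli->connect_errno)
    throw new Exception('DB connect failed: '.$mysqli->connect_error);
$mysqli->set_charset('gbk');

$lastId = 0;
$step   = 10000;

do {
    $sql = "SELECT id,filepath,ids,duplicatefile,filesize FROM duplicate_attachment"
         . " WHERE id > $lastId AND duplicatefile IS NOT NULL ORDER BY id LIMIT $step";

    // MYSQLI_USE_RESULT streams rows from the server instead of buffering
    // the whole chunk in PHP memory (the default MYSQLI_STORE_RESULT).
    $result = $mysqli->query($sql, MYSQLI_USE_RESULT);
    if ($result === false)
        throw new Exception('Query failed: '.$mysqli->error);

    $fetched = 0;
    while ($row = $result->fetch_row()) {
        $fetched++;
        $lastId = (int)$row[0]; // advance by the last id actually seen
        // ... process $row here ...
    }

    // An unbuffered result must be fully read and freed before the next query.
    $result->free();
    echo memory_get_peak_usage(true)."\n";
} while ($fetched > 0);

$mysqli->close();

Whether this removes the memory growth in your setup depends on what the row processing itself allocates, but the buffered-vs-unbuffered difference is usually the dominant cost when the batches are this large.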