I am trying to read 738,627 records from a flat file into MySQL. The script appears to run fine, but it gives me the memory error mentioned above.
A sample of the file is:
#export_dategenre_idapplication_idis_primary
#primaryKey:genre_idapplication_id
#dbTypes:BIGINTINTEGERINTEGERBOOLEAN
#exportMode:FULL
127667880285760002817317350
127667880285760002818261461
127667880285760002825372301
127667880285760002827785570
127667880285760002827930241
127667880285760002827987861
127667880285760002828089791
127667880285760002828168361
127667880285760002828192041
127667880285760002829144541
127667880285760002829351511
I tried to increase the allowed memory with
ini_set("memory_limit","80M");
and it still fails. Do I just keep increasing this until it runs?
The full code is:
<?php
ini_set("memory_limit","80M");
$db = mysql_connect("localhost", "uname", "pword");
// test connection
if (!$db) {
    echo "Couldn't make a connection!";
    exit;
}
// select database
if (!mysql_select_db("dbname",$db))
{
    echo "Couldn't select database!";
    exit;
}
mysql_set_charset('utf8',$db);
$delimiter = chr(1);
$eoldelimiter = chr(2) . "\n";
$fp = fopen('genre_application','r');
if (!$fp) {echo 'ERROR: Unable to open file.</table></body></html>'; exit;}
$loop = 0;
while (!feof($fp)) {
    $loop++;
    $line = stream_get_line($fp,128,$eoldelimiter); //use 2048 if very long lines
    if ($line[0] === '#') continue; //Skip lines that start with #
    $field[$loop] = explode ($delimiter, $line);
    $fp++;
    $export_date = $field[$loop][0];
    $genre_id = $field[$loop][1];
    $application_id = $field[$loop][2];
    $query = "REPLACE into genre_apps
              (export_date, genre_id, application_id)
              VALUES ('$export_date','$genre_id','$application_id')";
    print "SQL-Query: ".$query."<br>";
    if (mysql_query($query,$db))
    {
        echo " OK !\n";
    }
    else
    {
        echo "Error<br><br>";
        echo mysql_errno() . ":" . mysql_error() . "</font></center><br>\n";
    }
}
fclose($fp);
?>
Answer 0 (score: 2)
Your loop fills the $field array for no reason
(it writes to a different cell on each iteration), so it consumes more memory with every line read.
You can replace:
$field[$loop] = explode ($delimiter, $line);
$export_date = $field[$loop][0];
$genre_id = $field[$loop][1];
$application_id = $field[$loop][2];
with:
list($export_date, $genre_id, $application_id) = explode($delimiter, $line);
For better performance, you can take advantage of the fact that REPLACE INTO
can insert multiple rows at once, by grouping N rows into a single query.
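For example, a minimal sketch of such batching (it reuses $fp, $delimiter, $eoldelimiter and the mysql_* connection $db from the question; the batch size of 500 and the names $batch / $batchSize are arbitrary choices for illustration):

$batch = array();
$batchSize = 500; // arbitrary; tune to taste

while (!feof($fp)) {
    $line = stream_get_line($fp, 128, $eoldelimiter);
    if ($line === false || $line === '' || $line[0] === '#') continue; // skip header/comment lines

    list($export_date, $genre_id, $application_id) = explode($delimiter, $line);

    // Collect one escaped VALUES tuple per row instead of running a query per line
    $batch[] = sprintf("('%s','%s','%s')",
        mysql_real_escape_string($export_date, $db),
        mysql_real_escape_string($genre_id, $db),
        mysql_real_escape_string($application_id, $db));

    if (count($batch) >= $batchSize) {
        $query = "REPLACE INTO genre_apps (export_date, genre_id, application_id) VALUES "
               . implode(',', $batch);
        if (!mysql_query($query, $db)) {
            echo mysql_errno() . ":" . mysql_error() . "\n";
        }
        $batch = array(); // release the rows we just sent
    }
}

// Flush whatever is left over after the loop
if (count($batch) > 0) {
    $query = "REPLACE INTO genre_apps (export_date, genre_id, application_id) VALUES "
           . implode(',', $batch);
    mysql_query($query, $db);
}

A larger batch means fewer round trips to MySQL, but the query string also grows with it, so the batch size is a trade-off between throughput and query size (bounded by max_allowed_packet on the server).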