// get the uploaded csv file
$file   = $_FILES['csv']['tmp_name'];
$handle = fopen($file, "r");

// loop through the csv file and insert into database
while (($data = fgetcsv($handle, 1000, ",", "'")) !== false) {
    if ($data[0]) {
        $expression = "/^[_a-z0-9-]+(\.[_a-z0-9-]+)*@[a-z0-9-]+(\.[a-z0-9-]+)*(\.[a-z]{2,3})$/";
        if (preg_match($expression, $data[0])) {
            mysql_query("SET NAMES utf8");
            // skip addresses that are already in the table
            $query = mysql_query("SELECT * FROM `postfix`.`recipient_access` WHERE recipient = '" . addslashes($data[0]) . "'");
            $fetch = mysql_fetch_array($query);
            if ($fetch['recipient'] != $data[0]) {
                mysql_query("INSERT INTO `postfix`.`recipient_access` (`recipient`, `note`) VALUES ('" . addslashes($data[0]) . "', '" . addslashes($_POST['note']) . "')");
            }
        }
    }
}
fclose($handle);
Answer 0 (score: 1)
First, I cannot stress this enough: fix your indentation - it will make life easier for everyone.
Second, the answer depends heavily on where your actual bottleneck is. The sketch below splits the CSV file into chunks and hands each chunk to a separate background PHP process, so the rows are imported in parallel:
common.php
// Scans a file and reports the byte offset right after each line break,
// so the main script can tell each worker where its chunk of the CSV starts.
class FileLineFinder {
    protected $handle, $length, $curpos;

    public function __construct($file){
        $this->handle = fopen($file, 'r');
        $this->length = strlen(PHP_EOL);
        $this->curpos = 0;
    }
    public function next_line(){
        while(!feof($this->handle)){
            $b = fread($this->handle, $this->length);
            $this->curpos += $this->length;
            if ($b == PHP_EOL) return $this->curpos;
        }
        return false;
    }
    public function skip_lines($count){
        for($i = 0; $i < $count; $i++)
            $this->next_line();
    }
    public function __destruct(){
        fclose($this->handle);
    }
}

// Launch $cmd in the background, redirect its output to $outfile and record its pid in $pidfile.
function exec_async($cmd, $outfile, $pidfile){
    exec(sprintf("%s > %s 2>&1 & echo $! >> %s", $cmd, $outfile, $pidfile));
}
main.php
require('common.php');

$maxlines = 200;    // maximum number of lines each subtask processes at a time
$note     = $_POST['note'];
$file     = $_FILES['csv']['tmp_name'];
$outdir   = dirname(__FILE__) . DIRECTORY_SEPARATOR . 'out' . DIRECTORY_SEPARATOR;

// make sure our output directory exists
if(!is_dir($outdir))
    if(!mkdir($outdir, 0755, true))
        die('Cannot create output directory: '.$outdir);

// run a background task for each chunk of lines in the csv file
$i = 0; $pos = 0;
$l = new FileLineFinder($file);
do {
    $i++;
    exec_async(
        'php -f sub.php -- '.$pos.' '.$maxlines.' '.escapeshellarg($file).' '.escapeshellarg($note),
        $outdir.'proc'.$i.'.log',
        $outdir.'proc'.$i.'.pid'
    );
    // advance to the start of the next chunk: $maxlines - 1 skips here,
    // plus the one extra line consumed by next_line() in the loop condition
    $l->skip_lines($maxlines - 1);
} while($pos = $l->next_line());

// wait for every task to finish; a pid file whose process has exited is removed
// (the liveness check uses posix_kill and therefore needs the posix extension)
do {
    $tasks = 0;
    foreach (glob($outdir.'proc*.pid') as $pidfile) {
        $pid = (int)trim(file_get_contents($pidfile));
        if ($pid && posix_kill($pid, 0)) {
            $tasks++;
        } else {
            unlink($pidfile);
        }
    }
    echo 'Remaining Tasks: '.$tasks.PHP_EOL;
    sleep(1);
} while ($tasks > 0);
echo 'Finished!'.PHP_EOL;
sub.php
require('common.php');

$start = (int)$argv[1];    // byte offset of the first line of this chunk
$count = (int)$argv[2];    // number of lines to process
$file  = $argv[3];
$note  = mysql_real_escape_string($argv[4]);   // assumes the mysql connection is opened in common.php

$expression = "/^[_a-z0-9-]+(\.[_a-z0-9-]+)*@[a-z0-9-]+(\.[a-z0-9-]+)*(\.[a-z]{2,3})$/";
mysql_query('SET NAMES utf8');

$handle = fopen($file, 'r');
fseek($handle, $start, SEEK_SET);

// loop through this chunk of the csv file and insert into the database
$lines = 0;
while ($lines < $count && ($data = fgetcsv($handle, 1000, ',', "'")) !== false) {
    $lines++;
    if ($data[0] && preg_match($expression, $data[0])) {
        $query = mysql_query('SELECT * FROM `postfix`.`recipient_access` WHERE recipient = "'.$data[0].'"');
        $fetch = mysql_fetch_array($query);
        if ($fetch['recipient'] != $data[0]) {
            mysql_query('INSERT INTO `postfix`.`recipient_access` (`recipient`, `note`) VALUES ("'.$data[0].'", "'.$note.'")');
        }
    }
}
fclose($handle);
Credits
Answer 1 (score: 0)
The most pressing thing is to make sure your database is properly indexed, so that the lookup query you run for every row is as fast as possible.
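As a minimal sketch of that (the index name idx_recipient is made up, and it assumes `recipient` is a VARCHAR column; a TEXT column would need a prefix length), an index on the column used in the per-row WHERE clause turns the lookup from a table scan into an index lookup:

// one-off schema change: index the column used by the per-row SELECT
mysql_query('ALTER TABLE `postfix`.`recipient_access` ADD INDEX `idx_recipient` (`recipient`)');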
Apart from that, there is not much more you can do: PHP itself does not give you a real multi-threaded solution.
You could also import the CSV file into MySQL directly and then use your PHP script to weed out the superfluous data afterwards - that is probably the fastest way to do it.
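A rough sketch of that approach, assuming the server and client allow LOAD DATA LOCAL INFILE and reusing the table and column names from the question (the single-column CSV layout and the crude cleanup filter are assumptions):

// bulk-load the raw CSV in one statement, then clean up afterwards
mysql_query("LOAD DATA LOCAL INFILE '" . addslashes($_FILES['csv']['tmp_name']) . "'
             INTO TABLE `postfix`.`recipient_access`
             FIELDS TERMINATED BY ',' ENCLOSED BY '\''
             (`recipient`)");
// weed out rows that clearly are not e-mail addresses (a crude filter; the regex
// check from the question can still be run in PHP afterwards, and the note column
// would need to be filled in separately)
mysql_query("DELETE FROM `postfix`.`recipient_access` WHERE `recipient` NOT LIKE '%@%.%'");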
Answer 2 (score: 0)
General advice: the key to speeding up any program is knowing which part takes most of the time,
and then working out how to reduce it. Sometimes the actual result will surprise you.
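As a small, self-contained sketch of that kind of measurement (the checkpoint() helper is made up and the usleep() calls only stand in for the real CSV read and database work), timing each step with microtime() shows where the seconds actually go:

// minimal timing helper: prints the elapsed time since the previous checkpoint
function checkpoint(&$last, $label) {
    $now = microtime(true);
    printf("%-10s %.4f s\n", $label, $now - $last);
    $last = $now;
}

$last = microtime(true);
usleep(50000);                  // stand-in for reading a CSV row
checkpoint($last, 'read');
usleep(150000);                 // stand-in for the SELECT + INSERT round trip
checkpoint($last, 'database');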
By the way, I do not think multithreading will solve your problem.
Answer 3 (score: 0)
Put the whole loop inside a single SQL transaction. That will speed things up by an order of magnitude.
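A minimal sketch of that idea around the question's import loop (it assumes recipient_access uses a transactional engine such as InnoDB; on MyISAM the statements are accepted but have no effect, and the per-row duplicate check is omitted here for brevity):

// same setup as in the question
$handle     = fopen($_FILES['csv']['tmp_name'], 'r');
$expression = "/^[_a-z0-9-]+(\.[_a-z0-9-]+)*@[a-z0-9-]+(\.[a-z0-9-]+)*(\.[a-z]{2,3})$/";

// one transaction around the whole import: a single commit instead of one per INSERT
mysql_query('START TRANSACTION');
while (($data = fgetcsv($handle, 1000, ',', "'")) !== false) {
    if ($data[0] && preg_match($expression, $data[0])) {
        mysql_query("INSERT INTO `postfix`.`recipient_access` (`recipient`, `note`) VALUES ('" . addslashes($data[0]) . "', '" . addslashes($_POST['note']) . "')");
    }
}
mysql_query('COMMIT');
fclose($handle);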