The uploader works fine until a file exceeds 100,000 lines. I didn't write the code, but I'd like to fix it. I work in other languages, just not PHP. I know there are several ways to approach this, but I'm not sure which is the best investment of my time. Ideally, I'd like the uploader to accept files of any size.

Changing the memory allocation seems like the quickest fix, but I'd expect long-term problems once files outgrow memory. Flushing memory and uploading in batches seem like two sides of the same coin; however, the uploader currently handles only one file and one upload to the database: each time a file is uploaded, it deletes the previous data and replaces it with the data from the file. Specifically, I've been working on the CSV uploader rather than the XLSX uploader.
I've tried allocating extra memory to the program, without success: it crashed the server, so I don't want to do that again. I also tried batching the CSV file, but that failed as well.
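For context, the "give it more memory" approach I tried looks roughly like this (a hedged illustration, not code from the uploader): the limit can be raised at runtime, but any fixed value still fails once a file outgrows it, so it only postpones the crash.

```php
<?php
// Raising PHP's memory limit at runtime. This works mechanically, but a
// 200,000-line file will eventually blow past whatever value is chosen,
// which is why I'm looking for a streaming solution instead.
ini_set('memory_limit', '256M');
$limit = ini_get('memory_limit');
echo $limit;  // 256M
```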
<?php
class Part {
    public $id;
    public $oem;
    public $part_number;
    public $desc;

    // Assign the values
    public function __construct($id, $oem, $part_number, $desc) {
        $this->id = $id;
        $this->oem = $oem;
        $this->part_number = $part_number;
        $this->desc = $desc;
    }
}

// Imports a single CSV file and returns an array of Parts
function importCSVpartfinder($filename, $brand, $root){ // $filename is a dataTable: first row contains dimension labels, second row is units, first column is the part number
    $handle = fopen($filename, 'r') or die("unable to open file: $filename");
    $contents = fread($handle, filesize($filename));
    fclose($handle);
    $row = explode("\r", $contents);
    $data = array();
    $data2 = array();
    for ($i = 0; $i < sizeof($row); $i++) {
        $columns = explode(",", $row[$i]);
        array_push($data, $columns);
    }
    $all = array(); // array of all Parts
    // I should probably sanitize here
    for ($i = 0; $i < sizeof($data); $i++) {
        if (sizeof($data[$i]) != 1) {
            $id = $data[$i][0];
            $oem = $data[$i][1];
            $part_number = $data[$i][2];
            $desc = $data[$i][3];
            $obj = new Part($id, $oem, $part_number, $desc);
            array_push($all, $obj);
        }
    }
    return $all;
}

// Returns a message with # of successes and a list of failures // this is slow with large uploads
function addPartsToDB($data, $connection){ // $data is an array of Parts
    // delete
    $deleteSQL = "DELETE FROM Part_finder WHERE 1";
    $res = $connection->query($deleteSQL);
    if (!$res) {
        echo " Failed to delete Part_finder data, ";
        exit;
    }
    // insert
    $e = 0;
    $s = 0;
    $failures = "";
    $d = "";
    for ($i = 0; $i < sizeof($data); $i++) {
        $d .= "(".$data[$i]->id.",'".$data[$i]->oem."','".$data[$i]->part_number."','".$data[$i]->desc."'),";
        $s++;
    }
    $d = substr($d, 0, -1);
    $sqlquery = "INSERT INTO Part_finder (id_part, oem, part_number, description) VALUES $d";
    $res = $connection->query($sqlquery);
    if (!$res) {
        $sqlError = $connection->error;
        return ($s." items failed to update. Database error. ".$sqlError);
    } else {
        return ($s." items updated.");
    }
    /*
    for ($i = 0; $i < sizeof($data); $i++) {
        $d = "(".$data[$i]->id.",'".$data[$i]->oem."','".$data[$i]->part_number."','".$data[$i]->desc."')";
        $sqlquery = "INSERT INTO Part_finder (id_part, oem, part_number, description) VALUES $d";
        #$res = $connection->query($sqlquery);
        if (!$res) {
            $failures .= $data[$i]->part_number."\n";
            $e++;
        } else {
            $s++;
        }
    }
    */
    #return $sqlquery;
}

function importXLSXpartfinder($filename, $root){
    require($root.'./plugins/XLSXReader/XLSXReader.php');
    $xlsx = new XLSXReader($filename);
    /* $sheetNames = $xlsx->getSheetNames();
    foreach ($sheetNames as $Name) {
        $sheetName = $Name;
    } */
    $sheet = $xlsx->getSheet("Sheet1");
    $rawData = $sheet->getData();
    #$columnTitles = array_shift($rawData);
    $all = array(); // array of all Parts
    for ($i = 0; $i < sizeof($rawData); $i++) {
        if (sizeof($rawData[$i]) != 1) {
            $id = $rawData[$i][0];
            $oem = $rawData[$i][1];
            $part_number = $rawData[$i][2];
            $desc = $rawData[$i][3];
            $obj = new Part($id, $oem, $part_number, $desc);
            array_push($all, $obj);
        }
    }
    return $all;
}

// $file comes from edit.php
$filename = $file["partfinder"]["tmp_name"];
if ($file["partfinder"]["size"] > 100000000) {
    echo "File too big: ".$file["partfinder"]["size"];
    exit;
}
if ($file["partfinder"]["type"] === "text/csv") {
    $a = importCSVpartfinder($filename, $brand, $root);
} elseif ($file["partfinder"]["type"] === "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet") {
    $a = importXLSXpartfinder($filename, $root);
} else {
    var_dump($file["partfinder"]["type"]);
    echo ".xlsx or .csv file types only";
    exit;
}
$b = addPartsToDB($a, $connection);
echo $b;
?>
The memory exhaustion currently occurs at line 25,

$columns = explode(",", $row[$i]);

with the error:

Fatal error: Allowed memory size of 94371840 bytes exhausted (tried to allocate 20480 bytes) in /www/tools/import-csv-partfinder.php on line 25
Ideally, I'd still like to update the database by uploading a single file, and I'd have to change other programs in order to upload multiple files or to stop wiping the database on every upload. Unfortunately, I can't reach the person who originally wrote the program, so I'm on my own here.
Answer (score: 1)
I'd suggest using a generator to read the CSV rather than reading the whole thing into an array (actually two arrays, the way it's currently written). That way you only hold one line in memory at a time.
function importCSVpartfinder($filename = '') {
    $handle = fopen($filename, 'r');
    while (($row = fgetcsv($handle)) !== false) {
        yield $row;
    }
    fclose($handle);
}
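To see the streaming behavior concretely, here is a small self-contained demo (mine, not part of the original answer): it writes a two-row CSV to a temp file and iterates the generator, which yields one parsed row at a time instead of building an array of the whole file.

```php
<?php
// Generator from the answer above: yields one CSV row per iteration,
// so peak memory stays flat no matter how long the file is.
function importCSVpartfinder($filename = '') {
    $handle = fopen($filename, 'r');
    while (($row = fgetcsv($handle)) !== false) {
        yield $row;
    }
    fclose($handle);
}

// Hypothetical sample data standing in for the real upload
$tmp = tempnam(sys_get_temp_dir(), 'parts');
file_put_contents($tmp, "1,ACME,PN-001,widget\n2,ACME,PN-002,gadget\n");

$count = 0;
foreach (importCSVpartfinder($tmp) as $row) {
    // each $row is an array of fields, e.g. ["1", "ACME", "PN-001", "widget"]
    $count++;
}
unlink($tmp);
echo $count;  // 2
```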
Then for your database insert function, use a prepared statement and iterate over the generator, executing the statement once for each row in the file.
function addPartsToDB($parts, $connection) {
    $connection->query('DELETE FROM Part_finder');
    $statement = $connection->prepare('INSERT INTO Part_finder
                                       (id_part, oem, part_number, description)
                                       VALUES (?, ?, ?, ?)');
    foreach ($parts as $part) {
        $statement->execute($part);
    }
}
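If per-row execution is still slow on large files (the question notes the old uploader was slow), one common refinement is to wrap the inserts in a single transaction so thousands of rows commit at once. The sketch below assumes `$connection` is a PDO handle (the `execute($part)` call in the answer matches PDO's API) and uses an in-memory SQLite database purely so the example is self-contained; with MySQL you would construct PDO with a `mysql:` DSN instead.

```php
<?php
// Hedged sketch: same prepared-statement pattern as the answer, plus a
// transaction around the loop. SQLite in-memory stands in for the real DB.
$connection = new PDO('sqlite::memory:');
$connection->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$connection->query('CREATE TABLE Part_finder
                    (id_part INT, oem TEXT, part_number TEXT, description TEXT)');

function addPartsToDB($parts, $connection) {
    $connection->query('DELETE FROM Part_finder');
    $statement = $connection->prepare(
        'INSERT INTO Part_finder (id_part, oem, part_number, description)
         VALUES (?, ?, ?, ?)'
    );
    $connection->beginTransaction();      // one commit instead of one per row
    foreach ($parts as $part) {
        $statement->execute($part);       // $part is one row from the generator
    }
    $connection->commit();
}

// Hypothetical rows standing in for the CSV generator's output
$rows = [
    [1, 'ACME', 'PN-001', 'widget'],
    [2, 'ACME', 'PN-002', 'gadget'],
];
addPartsToDB($rows, $connection);
$n = (int) $connection->query('SELECT COUNT(*) FROM Part_finder')->fetchColumn();
echo $n;  // 2
```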
These examples are simplified just to illustrate the concepts. You should be able to adapt them to your exact needs, but they are working examples as written.
addPartsToDB(importCSVpartfinder($filename), $connection);