Here is the main code of the method I use to import an Excel file (using Maatwebsite Laravel-Excel 2) into my database:
$data = Excel::selectSheetsByIndex(0)->load($file, function($reader) {})->get()->toArray();

DB::beginTransaction();
try {
    foreach ($data as $key => $value) {
        $med = trim($value["med"]);
        $serial = trim($value["nro.seriemedidor"]);
        DB::table('medidores')->insert([
            "med" => $med,
            "serial_number" => $serial
        ]);
    }
    DB::commit();
} catch (\Exception $e) {
    DB::rollback();
    return redirect()->route('myroute')->withErrors("Some error message");
}
This works fine when I have a "small" amount of data (say, fewer than 5,000 rows in the Excel file). But I need to process a large Excel file with 1.4 million rows, split across more than one sheet. How can I make my method faster? Any tips?
EDIT: I'm updating the question with the code from a link in one of the answer's comments:
$data = Excel::selectSheetsByIndex(0)->load($file, function($reader) {})->get()->toArray();

DB::beginTransaction();
try {
    $bulk_data = [];
    foreach ($data as $key => $value) {
        $med = trim($value["med"]);
        $serial = trim($value["nro.seriemedidor"]);
        $bulk_data[] = ["med" => $med, "serial_number" => $serial];
    }
    $collection = collect($bulk_data); // turn the data into a collection
    $chunks = $collection->chunk(100); // split it into chunks of 100 rows
    // loop through the chunks, inserting each batch with a single query:
    foreach ($chunks as $chunk) {
        DB::table('medidores')->insert($chunk->toArray());
    }
    DB::commit();
} catch (\Exception $e) {
    DB::rollback();
    return redirect()->route('myroute')->withErrors("Some error message");
}
The chunking approach is what worked for me.
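As a side note, the same batching can be sketched in plain PHP without Laravel's collections: `array_chunk()` splits the bulk array into fixed-size batches, so a 1.4M-row import becomes roughly 14,000 inserts of 100 rows each instead of 1.4M single-row inserts. This is only an illustrative sketch of the splitting step; in the real code each batch would be passed to `DB::table('medidores')->insert($batch)`.

```php
<?php
// Sketch: build a sample bulk array (stand-in for the rows read from Excel).
$bulk_data = [];
for ($i = 1; $i <= 250; $i++) {
    $bulk_data[] = ["med" => "M$i", "serial_number" => "S$i"];
}

// Split into batches of 100 rows, same effect as collect($bulk_data)->chunk(100).
$batches = array_chunk($bulk_data, 100);

echo count($batches) . "\n";    // 3 batches: 100 + 100 + 50 rows
echo count($batches[0]) . "\n"; // 100
echo count($batches[2]) . "\n"; // 50
```

Each element of `$batches` is a plain array ready to be handed to a single multi-row `insert()` call.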
Answer 0 (score: 2)
Yes, you can. Instead of performing X (rows per sheet) × N (number of sheets) database requests — one insert per row — do a simple bulk insert. Saving your data then costs only the complexity of the loop that builds the array, plus a single database request instead of X × N of them. Here is an example:
$data = Excel::selectSheetsByIndex(0)->load($file, function($reader) {})->get()->toArray();

DB::beginTransaction();
try {
    $bulk_data = [];
    foreach ($data as $key => $value) {
        $med = trim($value["med"]);
        $serial = trim($value["nro.seriemedidor"]);
        $bulk_data[] = ["med" => $med, "serial_number" => $serial];
    }
    DB::table('medidores')->insert($bulk_data);
    DB::commit();
} catch (\Exception $e) {
    DB::rollback();
    return redirect()->route('myroute')->withErrors("Some error message");
}
For more explanation about database requests, see this answer: https://stackoverflow.com/a/1793209/8008456