Laravel-Excel massive import

Posted: 2015-05-08 10:01:35

Tags: php laravel laravel-excel

So, I have an Excel file with 28k rows.
I want to load it and insert the rows into the database, but the script just stops (blank page).
I tried reducing it to 5k rows, and that worked, but it was far too slow.
I also tried using chunk(), still with only 5k rows, but I got "Maximum execution time of 300 seconds exceeded".
Here's roughly what I'm doing:

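(A minimal sketch, assuming maatwebsite/excel 2.x, the current version at the time; the file name, the Customer model, and the column names are placeholders for my real ones.)

```php
use Maatwebsite\Excel\Facades\Excel;

// Plain load: get() pulls every row of the sheet into memory at once,
// then each row is inserted one by one.
Excel::load('customers.xlsx', function ($reader) {
    foreach ($reader->get() as $row) {
        // Customer is a placeholder Eloquent model
        Customer::create([
            'name'  => $row->name,
            'email' => $row->email,
        ]);
    }
});
```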

Are 5k rows really that much to handle?
Or am I doing it wrong?
Thanks.

1 Answer:

Answer 0 (score: 0)

Using chunk() prevents excessive memory consumption, but it slows down execution.

If you want it to go faster, increase the chunk size, but be careful.

Note that at the end of each chunk, your application reads the file again, and that takes time.
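For illustration, a chunked import in maatwebsite/excel 2.x looks something like the sketch below; the 1,000-row chunk size and the Customer model are assumptions, not values from the post:

```php
use Maatwebsite\Excel\Facades\Excel;

// The 'chunk' filter re-opens the file for every chunk, so a larger
// chunk size means fewer passes over the file (faster) but more rows
// held in memory per pass.
Excel::filter('chunk')
    ->load('customers.xlsx')
    ->chunk(1000, function ($results) {
        foreach ($results as $row) {
            // Customer is a placeholder Eloquent model
            Customer::create([
                'name'  => $row->name,
                'email' => $row->email,
            ]);
        }
    });
```

Raising the chunk size from, say, 250 to 1,000 means four times fewer passes over the file, at the cost of holding four times as many rows in memory per pass.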