How to quickly upload a large CSV file in Laravel

Date: 2017-03-15 10:43:38

Tags: php postgresql csv laravel-5.4

This question has been asked many times, and I have tried several approaches, but this time I am stuck because my requirement is rather specific. None of the generic methods works for me.

Details

File size = 75 MB, total rows = 300,000

PHP code

// Assumed imports at the top of the class file (not shown in the original):
// use App\Flag;
// use Illuminate\Support\Facades\Config;
// use Illuminate\Support\Facades\DB;
// use Maatwebsite\Excel\Facades\Excel;

protected $chunkSize = 500;

public function handle()
{
    try {
        set_time_limit(0);

        // Pick the most recent file that has not been imported yet
        $file = Flag::where('imported', '=', '0')
                ->orderBy('created_at', 'DESC')
                ->first();

        $file_path = Config::get('filesystems.disks.local.root') . '/exceluploads/' . $file->file_name;

        // Let's first count the total number of rows
        Excel::load($file_path, function ($reader) use ($file) {
            $objWorksheet = $reader->getActiveSheet();
            $file->total_rows = $objWorksheet->getHighestRow() - 1; // exclude the heading
            $file->save();
        });

        $chunkid = 0;

        // Now let's import the rows, one chunk at a time, while keeping track of the progress
        Excel::filter('chunk')
            ->selectSheetsByIndex(0)
            ->load($file_path)
            ->chunk($this->chunkSize, function ($results) use ($file, &$chunkid) {
                // Let's do more processing (change values in cells) here as needed
                $counter = 0;
                $chunkid++; // captured by reference so the count persists across chunks
                $data = array();

                foreach ($results->toArray() as $row) {
                    $data[] = array(
                        'data'       => json_encode($row),
                        'created_at' => date('Y-m-d H:i:s'),
                        'updated_at' => date('Y-m-d H:i:s'),
                    );
                    $counter++;
                }

                // One multi-row insert per chunk instead of one save() per row
                DB::table('price_results')->insert($data);

                $file = $file->fresh(); // reload from the database
                $file->rows_imported = $file->rows_imported + $counter;
                $file->save();
                echo 'Rows Executed: ' . $file->rows_imported . PHP_EOL;
            }, false);

        $file->imported = 1;
        $file->save();

        echo 'end of execution';
    } catch (\Exception $e) {
        dd($e->getMessage());
    }
}

So, for a 10,000-row CSV file, the above code runs very fast.

But when I upload a larger CSV, it does not work.

My only constraint is that I have to convert each row of the CSV into key/value-pair JSON data using the following logic:

foreach ($results->toArray() as $row) {
    $data[] = array(
        'data'       => json_encode($row),
        'created_at' => date('Y-m-d H:i:s'),
        'updated_at' => date('Y-m-d H:i:s'),
    );
    $counter++;
}
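That constraint does not by itself require PHPExcel, though. If the uploaded file really is plain CSV, the same key/value JSON conversion can be done by streaming the file with PHP's built-in fgetcsv. The following is only a sketch under those assumptions (the header-row handling, the 1,000-row batch size, and the reuse of $file_path and the DB facade from above are all assumptions, not part of the original code):

// Sketch: stream the CSV instead of loading it through PHPExcel, keeping
// the same json_encode(header => value) shape per row. Assumes the first
// line is the header row and every data row has the same number of
// columns (array_combine fails on ragged rows).
$handle = fopen($file_path, 'r');
$header = fgetcsv($handle);

$batch = array();
$now   = date('Y-m-d H:i:s');

while (($row = fgetcsv($handle)) !== false) {
    $batch[] = array(
        'data'       => json_encode(array_combine($header, $row)),
        'created_at' => $now,
        'updated_at' => $now,
    );

    // Flush every 1,000 rows so memory stays flat
    if (count($batch) === 1000) {
        DB::table('price_results')->insert($batch);
        $batch = array();
    }
}

// Insert whatever is left from the final partial batch
if (!empty($batch)) {
    DB::table('price_results')->insert($batch);
}

fclose($handle);

This avoids holding the whole 75 MB spreadsheet in memory, which is where PHPExcel-based loaders tend to struggle.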

Any suggestions would be appreciated. It has now been over an hour and still only 100,000 rows have been inserted.

I find this very slow.
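One way to tell whether the time goes into parsing the spreadsheet or into the database writes is to time the two phases separately. A minimal sketch, assuming it replaces the body of the chunk callback above ($results and the price_results table come from that code; the split into "build" and "insert" phases is an assumption for diagnosis only):

// Sketch: time row conversion vs. insertion inside the chunk callback.
$t0 = microtime(true);

$data = array();
foreach ($results->toArray() as $row) {
    $data[] = array(
        'data'       => json_encode($row),
        'created_at' => date('Y-m-d H:i:s'),
        'updated_at' => date('Y-m-d H:i:s'),
    );
}
$t1 = microtime(true);

DB::table('price_results')->insert($data);
$t2 = microtime(true);

// If the build phase dominates, PHPExcel parsing is the bottleneck;
// if the insert phase dominates, the Postgres writes are.
echo sprintf('chunk: build %.2fs, insert %.2fs', $t1 - $t0, $t2 - $t1) . PHP_EOL;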

Database: POSTGRES
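Since the target is Postgres, its COPY path is typically much faster than multi-row INSERTs for bulk loads. A hedged sketch using the pdo_pgsql driver's pgsqlCopyFromArray(), where $data is the per-chunk array built above (the escaping shown covers only the common cases and would need care with arbitrary JSON content):

// Sketch: bulk-load one chunk via PostgreSQL's COPY protocol instead of
// INSERT. Each element of $lines is one tab-separated row in COPY text format.
$pdo = DB::connection()->getPdo();

$lines = array();
foreach ($data as $row) {
    // Escape backslash, tab, newline and carriage return for COPY text format
    $json = addcslashes($row['data'], "\\\t\n\r");
    $lines[] = $json . "\t" . $row['created_at'] . "\t" . $row['updated_at'];
}

$pdo->pgsqlCopyFromArray('price_results', $lines, "\t", '\\N', 'data,created_at,updated_at');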

0 Answers:

No answers