Optimizing the runtime of a per-entry search over a large array during file processing

Date: 2019-03-20 18:48:07

Tags: php laravel optimization

I have built a way to upload and process files with Laravel, but the runtime is very long.

The files look like this (they are very large, roughly 50,000 hands per file):

QhQs3s2s@86,QdQs3s2s@86,QcQs3s2s@86,KhKs3s2s@100,KdKs3s2s@100,KcKs3s2s@100,AhAs3s2s@86,AdAs3s2s@86,AcAs3s2s@86

The file is uploaded as a .txt and then split into chunks of 1,000 hands:

/**
 * Upload the range file, split it into chunks, and queue a job per chunk.
 */
public function uploadFile(Request $request)
{
    // process SituationName
    $name = $request->input('name');
    $situation = Situation::firstOrCreate(['name' => $name, 'active' => 1]);

    // process the Raise range file upload
    $action = Action::where('name', 'Raise')->first();
    $path = $request->file('rangeRaise')->store('ranges');

    // split the file into 1,000-hand chunks and queue a job per chunk
    $content = Storage::disk('local')->get($path);
    $array = explode(",", $content);
    $arrayFinal = array_chunk($array, 1000);

    foreach($arrayFinal as $arrayJob){
        $filename = 'ranges/RaiseFile'.uniqid().'.txt';
        Storage::disk('local')->put($filename, json_encode($arrayJob));
        ProcessRangeFiles::dispatch($action, $situation, $filename);
    }
}
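
For a sense of the data flow: explode() turns the comma-separated dump into one entry per hand, and array_chunk() groups those entries into fixed-size batches, so a 50,000-hand file produces about 50 chunk files and 50 queued jobs. A minimal illustration using the sample values above:

$content = 'QhQs3s2s@86,QdQs3s2s@86,QcQs3s2s@86';

// one entry per hand
$array = explode(',', $content);          // ['QhQs3s2s@86', 'QdQs3s2s@86', 'QcQs3s2s@86']

// fixed-size batches of up to 1,000 entries
$arrayFinal = array_chunk($array, 1000);  // here: a single chunk holding all 3 entries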

Each chunk is then dispatched as a job with the following handle() method:

public function handle()
{
    Log::info('File Processing started');
    $array = null;
    $content = null;
    $found = null;

    $path = $this->path;
    $action = $this->action;
    $situation = $this->situation;

    // load every possible Omaha hand into memory
    $hands = Hand::all();

    $content = json_decode(Storage::disk('local')->get($path));

    foreach ($content as $key => $line) {
        // split "QhQs3s2s@86" into the hand string and its percentage
        $array[$key] = explode('@', $line);

        // linear scan over every possible hand for every single line;
        // note: $found keeps its previous value if no hand matches
        foreach ($hands as $hand) {
            if ($hand->hand == $array[$key][0]) {
                $found = $hand;
                break;
            }
        }

        // one single-row insert per line
        DB::table('hands_to_situations_to_actions')->insert([
            'hand_id'      => $found->id,
            'action_id'    => $action->id,
            'situation_id' => $situation->id,
            'percentage'   => $array[$key][1],
            'created_at'   => Carbon::now()->toDateTimeString(),
            'updated_at'   => Carbon::now()->toDateTimeString(),
        ]);
    }
    Log::info('File Processing finished');
}

$hands is populated with every possible Omaha poker hand.

Does anyone know how to optimize this code? Right now each 1,000-hand chunk takes about 12 minutes to process.
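
For reference, the inner loop scans every hand for every line, i.e. O(lines × hands) per chunk. Keying the collection by its hand string turns each lookup into a hash access, and collecting the rows allows one multi-row insert per chunk. A minimal sketch of the handle() body along those lines (assuming the hand column is unique in the hands table, and that unknown hands can safely be skipped rather than reusing the previous match):

public function handle()
{
    Log::info('File Processing started');

    // key the collection by the `hand` column once,
    // so each lookup is a hash access instead of a full scan
    $hands = Hand::all()->keyBy('hand');

    $content = json_decode(Storage::disk('local')->get($this->path));
    $now = Carbon::now()->toDateTimeString();
    $rows = [];

    foreach ($content as $line) {
        [$handString, $percentage] = explode('@', $line);

        $hand = $hands->get($handString);
        if ($hand === null) {
            continue; // assumption: skip unknown hands
        }

        $rows[] = [
            'hand_id'      => $hand->id,
            'action_id'    => $this->action->id,
            'situation_id' => $this->situation->id,
            'percentage'   => $percentage,
            'created_at'   => $now,
            'updated_at'   => $now,
        ];
    }

    // one multi-row insert for the whole 1,000-line chunk
    // instead of 1,000 single-row inserts
    DB::table('hands_to_situations_to_actions')->insert($rows);

    Log::info('File Processing finished');
}

With the per-line scan removed, the remaining cost is dominated by the single batched insert, which should be far cheaper than 1,000 round-trips to the database.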
