I am trying to dump data from a CSV file into the database. Since this is time-consuming, I decided to use a Laravel queue.
The main oddity is the difference in behavior between the server and localhost:
$insertJob = (new StoreUser($data))->delay(Carbon::now()->addSeconds(3));
dispatch($insertJob);
$msg = [
    'status' => '1'
];
echo json_encode($msg);
exit();
In the above case, after dispatching the job, localhost receives status = 1, so a message pops up saying "You will receive an email notification once the job is complete." On the server, however, the upload bar just keeps spinning, i.e. no status is ever returned, and after a while it shows an internal server error.
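For context, StoreUser would be a queued job class roughly along these lines (a hypothetical skeleton assuming standard Laravel 5.x job boilerplate; the real contents of handle() are shown further down):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class StoreUser implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $data;

    public function __construct(array $data)
    {
        $this->data = $data;
    }

    public function handle()
    {
        // the CSV import logic from the question
    }
}
```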
PHP error log:
27351#27351: *114475 FastCGI sent in stderr: "PHP message: PHP Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 20480....)
I never get this error on localhost, even though localhost is allocated 128 MB while the server has about 1 GB.
Supervisor log:
2019-01-08 15:48:01,121 CRIT Supervisor running as root (no user in config file)
2019-01-08 15:48:01,122 WARN No file matches via include "/etc/supervisor/conf.d/*.conf"
2019-01-08 15:48:01,134 INFO RPC interface 'supervisor' initialized
2019-01-08 15:48:01,134 CRIT Server 'unix_http_server' running without any HTTP authentication checking
2019-01-08 15:48:01,134 INFO supervisord started with pid 2947
2019-01-08 16:17:45,836 INFO spawnerr: can't find command '/home/forge/site_address/php'
2019-01-08 16:17:46,837 INFO spawnerr: can't find command '/home/forge/site_address/php'
2019-01-08 16:17:48,841 INFO spawnerr: can't find command '/home/forge/site_address/php'
2019-01-08 16:17:51,845 INFO spawnerr: can't find command '/home/forge/site_address/php'
2019-01-08 16:17:51,846 INFO gave up: project_queue entered FATAL state, too many start retries too quickly
2019-01-10 16:14:18,566 INFO spawned: 'project_queue' with pid 25817
2019-01-10 16:14:19,591 INFO success: project_queue entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
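As a side observation, the spawnerr lines show Supervisor trying to execute /home/forge/site_address/php as the command itself, which suggests the program's command line does not point at the PHP binary. A sketch of a typical Laravel worker config; every path here is an assumption about this particular setup:

```ini
; /etc/supervisor/conf.d/project_queue.conf  (hypothetical paths)
[program:project_queue]
process_name=%(program_name)s_%(process_num)02d
; use the absolute path to the php binary, not a path inside the site directory
command=/usr/bin/php /home/forge/site_address/artisan queue:work --sleep=3 --tries=3
autostart=true
autorestart=true
user=forge
numprocs=1
redirect_stderr=true
stdout_logfile=/home/forge/site_address/worker.log
```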
In case you are curious what exactly is inside StoreUser's handle() method:
// $filename, $map_data, $table and $service are assumed to come from the
// job's constructor / class properties.
$db_header_obj = new Class_user();
$db_header = $db_header_obj->getTableColumns();

$base_user_header = new Base_user();
$base_user_columns = $base_user_header->getTableColumns();

$csv_file_path = storage_path('app/files/class_user/').$filename;

if (!ini_get("auto_detect_line_endings")) {
    ini_set("auto_detect_line_endings", TRUE);
}

$csv = Reader::createFromPath($csv_file_path, 'r');
$csv->setOutputBOM(Reader::BOM_UTF8);
$csv->addStreamFilter('convert.iconv.ISO-8859-15/UTF-8');
$csv->setHeaderOffset(0);
$csv_header = $csv->getHeader();

$offset = 0;   // chunk start (undefined in the original snippet)
$limit = 30;   // chunk size (undefined in the original snippet)
$row_no = 1;   // current CSV row, used in error reporting (undefined in the original snippet)
$error_arr = array();
$error_row_numbers = array();
$loop = true;

while ($loop) {
    $rec_arr = array();
    $records_arr = array();

    // Read the next chunk of rows from the CSV
    $stmt = (new Statement())
        ->offset($offset)
        ->limit($limit);
    $records = $stmt->process($csv);

    foreach ($records as $record) {
        $rec_arr[] = array_values($record);
    }
    $records_arr = $service->trimArray($rec_arr);

    if (count($records_arr) > 0) {
        foreach ($records_arr as $ck => $cv) {
            $class_user_arr = array();

            // Format the data type and decide whether the column should be inserted
            foreach ($map_data as $mk => $mv) {
                if (isset($mv)) {
                    $data_type = $service->getDatabaseColumnType($table, $mv);
                    if ($data_type == 'date' || $data_type == 'datetime' || $data_type == 'timestamp') {
                        $datetime = (array) $cv[$mk];
                        $dt = array_shift($datetime);
                        // 'H' (24-hour) rather than the original 'h' (12-hour),
                        // matching the activity timestamp below
                        $dt = date('Y-m-d H:i:s', strtotime($dt));
                        $class_user_arr[$mv] = $dt;
                    } else {
                        $class_user_arr[$mv] = $cv[$mk];
                    }
                }
            }

            $error_encountered = false;
            $base_user_table_id = NULL;

            $base_user_arr = array();
            foreach ($class_user_arr as $cvk => $cvv) {
                if (in_array($cvk, $base_user_columns)) {
                    $base_user_arr[$cvk] = $cvv;
                }
            }

            DB::beginTransaction();

            // Insert into the base (first) table
            try {
                $base_user_row = Base_user::updateOrCreate(
                    ['base_id' => $base_user_arr['base_id']],
                    $base_user_arr
                );
                if ($base_user_row->wasRecentlyCreated === true) {
                    $base_user_row->created_by = $this->data['user_id'];
                } else {
                    $base_user_row->updated_by = $this->data['user_id'];
                }
                $base_user_row->save();
                $base_user_table_id = $base_user_row->id;
            } catch (\Exception $e) {
                $error_encountered = true;
                $error_arr[] = $e->getMessage();
                $error_row_numbers[] = $row_no;
                DB::rollback(); // roll back here too, not only in the second try/catch
            }

            // Using the row id from the first table, insert into table number 2
            if ($error_encountered == false) {
                try {
                    $class_user_row = class_user::updateOrCreate(
                        ['base_user_id' => $base_user_table_id],
                        $class_user_arr
                    );
                    if ($class_user_row->wasRecentlyCreated === true) {
                        $class_user_row->created_by = $this->data['user_id'];
                    } else {
                        $class_user_row->updated_by = $this->data['user_id'];
                    }
                    $class_user_row->save();
                } catch (\Exception $e) {
                    $error_encountered = true;
                    $error_arr[] = $e->getMessage();
                    $error_row_numbers[] = $row_no;
                    DB::rollback();
                }
            }

            if ($error_encountered == false) {
                DB::commit(); // commit only when both inserts succeeded
            }

            $row_no = $row_no + 1;
        }
        $offset = $offset + $limit;
    } else {
        // No rows left: log the activity and notify the user
        $activity = new Activity();
        $activity->url = $this->data['url'];
        $activity->action = 'store';
        $activity->description = $table;
        $activity->user_id = $this->data['user_id'];
        $activity->created_at = date('Y-m-d H:i:s');
        $activity->save();

        $arr_data = [
            'filename' => $filename,
            'user_name' => $this->data['user_name'],
            'error' => $error_arr,
            'error_row_numbers' => $error_row_numbers
        ];

        // Inform the user that the job has completed
        Mail::to($this->data['user_email'])->send(new CSVImportJobComplete($arr_data));

        $loop = false;
    }
}

if (!ini_get("auto_detect_line_endings")) {
    ini_set("auto_detect_line_endings", FALSE);
}
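A side note on the date conversion in the code above: in PHP's date() format, `h` is the 12-hour hour and `H` the 24-hour hour, so using `h` silently corrupts afternoon timestamps. A quick check:

```php
<?php
// 'h' formats the 12-hour clock, 'H' the 24-hour clock.
$ts = strtotime('2019-01-08 15:48:01');
echo date('Y-m-d h:i:s', $ts), "\n"; // 2019-01-08 03:48:01
echo date('Y-m-d H:i:s', $ts), "\n"; // 2019-01-08 15:48:01
```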
I have no clue what the cause might be. Let me know if you need any further information.
Update:
production.ERROR: Allowed memory size of 2147483648 bytes exhausted (tried to allocate 20480 bytes)
I am fairly sure that raising the allocated memory is not the solution: the import succeeds on my local machine with 128 MB, yet fails on the server even with 2 GB.
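That impression matches the code: reading the CSV in offset/limit chunks keeps memory flat regardless of file size, so the exhaustion has to come from somewhere else. A self-contained sketch of the same chunking pattern, using plain fgetcsv and a temporary file instead of league/csv:

```php
<?php
// Write a small CSV to a temp file (stand-in for the uploaded file).
$path = tempnam(sys_get_temp_dir(), 'csv');
$fh = fopen($path, 'w');
fputcsv($fh, ['id', 'name']);
for ($i = 1; $i <= 100; $i++) {
    fputcsv($fh, [$i, "user$i"]);
}
fclose($fh);

// Process the file in fixed-size chunks, the same idea as
// Statement::offset()/limit(): only $limit rows are in memory at once.
$limit = 30;
$processed = 0;
$fh = fopen($path, 'r');
fgetcsv($fh); // skip the header row
while (true) {
    $chunk = [];
    while (count($chunk) < $limit && ($row = fgetcsv($fh)) !== false) {
        $chunk[] = $row;
    }
    if (count($chunk) === 0) {
        break; // no rows left
    }
    // ... insert $chunk into the database here ...
    $processed += count($chunk);
}
fclose($fh);
unlink($path);

echo $processed, "\n"; // 100
```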
Answer 0 (score: 1)
After a lot of searching, I found that this was caused by a single difference in the .env file. I needed to change
QUEUE_DRIVER=sync
to
QUEUE_DRIVER=database
on the server. With the sync driver, a dispatched job runs immediately inside the web request itself, so the entire CSV import was executing during the HTTP request and exhausting its memory; with the database driver, the job is stored and picked up by the queue worker instead.
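For completeness, the database driver needs a jobs table and a running worker; the usual steps look something like this (standard artisan commands, assuming a Laravel 5.x project):

```shell
php artisan queue:table   # generate the migration for the jobs table
php artisan migrate       # create the table
php artisan queue:work --tries=3   # start a worker (or run it under Supervisor)
```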