I have a PHP 7 CLI daemon that continuously parses JSON files of 50 MB and more. I use pcntl_fork() to hand off every 1000 parsed entries to a separate process that saves them to MySQL, and for ~200k rows this works fine.
Then I get pcntl_fork(): Error 35.
I believe this happens because the MySQL inserts become slower than the parsing, which causes more and more forks to be spawned until CentOS 6.3 can no longer handle them.
Is there a way to catch this error so I can fall back to single-process parsing and saving? Or is there a way to check the number of child processes?
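For reference, pcntl_fork() returns -1 on failure (error 35 is EAGAIN, "resource temporarily unavailable", on Linux), so the failure can be detected inline and handled without crashing. A minimal sketch, assuming a hypothetical $doWork callable standing in for the parse-and-save step:

```php
<?php
// Sketch: try to fork; if the fork fails (e.g. EAGAIN), fall back to
// doing the work in the current process instead of dying.
function storeWithFallback(callable $doWork) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        // Fork failed; pcntl_get_last_error() returns the errno.
        $errno = pcntl_get_last_error();
        error_log('fork failed: ' . pcntl_strerror($errno));
        $doWork(); // resort to single-process saving
    } elseif ($pid === 0) {
        $doWork(); // child process does the work
        exit(0);
    }
    // Parent continues parsing; children must be reaped elsewhere
    // (e.g. with pcntl_waitpid) to avoid zombies.
    return $pid;
}
```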
Answer 0 (score: 1)
Here is a solution based on @Sander Visser's comment. The key part is checking the number of existing child processes and resorting to the current process if too many of them exist.
class serialJsonReader {
    const MAX_CHILD_PROCESSES = 50;

    private $child_processes = []; // PIDs of live child processes

    private function flushCachedDataToStore() {
        // Too many children alive: reap finished ones and resort to
        // doing the work in the current (single) process.
        if (count($this->child_processes) > self::MAX_CHILD_PROCESSES) {
            $this->checkChildProcesses();
            $this->storeCollectedData(); // main work here
        }
        // Otherwise fork and let a child do the work in parallel.
        else {
            $pid = pcntl_fork();
            if ($pid === 0) {
                // Child process
                $this->storeCollectedData(); // main work here
                exit();
            }
            elseif ($pid === -1) {
                die('could not fork');
            }
            else {
                // Parent process: remember the child's PID
                $this->child_processes[] = $pid;
                $this->checkChildProcesses();
            }
        }
    }

    private function checkChildProcesses() {
        if (count($this->child_processes) > self::MAX_CHILD_PROCESSES) {
            foreach ($this->child_processes as $key => $pid) {
                $res = pcntl_waitpid($pid, $status, WNOHANG);
                // pcntl_waitpid() returns the PID if the child has
                // exited, 0 if it is still running, -1 on error.
                if ($res === -1 || $res > 0) {
                    unset($this->child_processes[$key]);
                }
            }
        }
    }
}
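An alternative throttling strategy (not part of the answer above, just a sketch): instead of falling back to single-process saving when the limit is reached, block until at least one child exits, which keeps the child count bounded while still doing all the work in children:

```php
<?php
// Sketch: block until one child exits, then remove its PID from the
// tracking array. Names are illustrative, not from the original code.
function reapOneBlocking(array &$children) {
    // pcntl_wait() without WNOHANG blocks until any child changes state.
    $pid = pcntl_wait($status);
    if ($pid > 0) {
        $key = array_search($pid, $children, true);
        if ($key !== false) {
            unset($children[$key]);
        }
    }
}
```

This trades a little parsing throughput (the parent stalls while waiting) for a hard upper bound on concurrent MySQL writers.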