I have a CSV file with about 1.8 million rows that I need to insert into a MySQL table from a PHP script, inserting 10,000 values per batch. The script runs for a long time and crashes after inserting 80-95 batches. I also tried mysql_unbuffered_query(), but it did not help.
if ($fp) {
    $batch = 1;
    $row_count = 1;
    $bucket_counter = 1;
    $mobile_numbers = array();
    $row_count_for_DB_write = 0;
    foreach ($campaign_numbers as $value) {
        // Write the number to the current CSV bucket file
        $number = array($value);
        fputcsv($fp, $number);
        $row_count_for_DB_write++;
        // Buffer the row for the next DB batch insert
        $value_row = new stdClass();
        $value_row->number = $value;
        $value_row->bucket_number = $bucket_counter;
        $mobile_numbers[] = $value_row;
        if ($row_count == $bucket_size && $bucket_counter < $bucket_count) {
            $bucket_counter++;
            $row_count = 0; // reset to 0: the ++ below starts the new bucket at 1
            fclose($fp);
            $fp = fopen($directory . "/cn_$bucket_counter.csv", 'w');
            // Double quotes so $bucket_counter is interpolated into the log message
            $logger->debug("Created csv file : $directory/cn_$bucket_counter.csv");
        }
        if ($row_count_for_DB_write == CONSTANTS::BATCH_SIZE) {
            $logger->debug($batch . " Batch insert starting at: " . date('d-m-Y_H-i-s', time()));
            $insert_count = $data_service->add_to_mobile_numbers_table($mobile_numbers_table, $mobile_numbers);
            $batch++;
            $logger->debug("Batch insert ending at: " . date('d-m-Y_H-i-s', time()));
            $row_count_for_DB_write = 0; // reset to 0: the buffer is emptied below
            unset($mobile_numbers);
            $mobile_numbers = array();
        }
        $row_count++;
    }
}
fclose($fp);
// Flush the final, partial batch that never reached BATCH_SIZE
$data_service->add_to_mobile_numbers_table($mobile_numbers_table, $mobile_numbers);
$zip_file = "/$directory_name.zip";
$logger->debug('Creating zipped file');
Util::create_zip(Util::get_list_of_files($directory), $directory . $zip_file);
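The question does not show add_to_mobile_numbers_table(), so the following is only a minimal sketch of what a multi-row batch insert might look like with mysqli; the column names number and bucket_number and the $mysqli connection object are assumptions, not taken from the question.

function add_to_mobile_numbers_table(mysqli $mysqli, $table, array $mobile_numbers)
{
    if (empty($mobile_numbers)) {
        return 0; // nothing buffered for this batch
    }
    $rows = array();
    foreach ($mobile_numbers as $row) {
        // Escape values because they are interpolated into the SQL string
        $number = $mysqli->real_escape_string($row->number);
        $bucket = (int) $row->bucket_number;
        $rows[] = "('$number', $bucket)";
    }
    // One multi-row INSERT per batch keeps client/server round trips low
    $sql = "INSERT INTO `$table` (number, bucket_number) VALUES " . implode(',', $rows);
    if (!$mysqli->query($sql)) {
        throw new RuntimeException('Batch insert failed: ' . $mysqli->error);
    }
    return $mysqli->affected_rows;
}

Batching 10,000 rows per statement means roughly 180 INSERT statements for 1.8 million rows, which matches the question's report of a crash after 80-95 batches, i.e. about halfway through.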
Answer 0 (score: 2)
The following steps solved the problem (a sketch of the statements follows the list):
Change the table engine from InnoDB to MyISAM
Disable the table's keys (indexes)
Insert the data
Re-enable the keys
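Concretely, and assuming a table named mobile_numbers and a $mysqli connection (both are assumptions, not taken from the answer), the sequence might look like this. Note that ALTER TABLE ... DISABLE KEYS suspends maintenance of non-unique indexes only, and only works on MyISAM tables:

// Assumed connection; replace host/credentials with real values
$mysqli = new mysqli('localhost', 'user', 'password', 'campaign_db');

// 1. One-off engine switch: MyISAM avoids InnoDB's transactional overhead on bulk loads
$mysqli->query('ALTER TABLE mobile_numbers ENGINE=MyISAM');

// 2. Suspend non-unique index maintenance while loading
$mysqli->query('ALTER TABLE mobile_numbers DISABLE KEYS');

// 3. Run the batched inserts here (e.g. the loop from the question)

// 4. Rebuild the indexes in a single pass after the load
$mysqli->query('ALTER TABLE mobile_numbers ENABLE KEYS');

Rebuilding the indexes once at the end is far cheaper than updating them incrementally on every one of the ~180 batch inserts.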