I have fetched data from the table tb_project_milestones and want to use a stream to insert each projectMilestoneRow into the table tb_xyz. I checked the documentation but couldn't find a way to do this. Has anyone implemented both streaming reads and streaming inserts with MySQL?
let insertProjectMilestones = [];
const getProjectMilestones = executeQueryStream.query('SELECT * FROM tb_project_milestones WHERE project_id = ? ');

getProjectMilestones
  .on('error', function(err) {
    // Handle error, an 'end' event will be emitted after this as well
  })
  .on('result', function(projectMilestoneRow) {
    // Pausing the connection is useful if your processing involves I/O
    connection.pause();

    processRow(projectMilestoneRow, function() {
      _.each(payload.projects, (project_id) => {
        _.each(projectMilestoneRow, (el) => {
          insertProjectMilestones.push([el.project_milestone_id, el.name, el.prefix, el.short_name, el.description, el.pre_requisites, project_id,
            el.milestone_template_id, el.generic_milestone_id, el.planned_date, el.actual_date, el.forecast_date,
            el.planned_date_only, el.forecast_date_only, el.actual_date_only, el.planned_time_only, el.forecast_time_only, el.actual_time_only,
            el.planned_date_formula, el.actual_date_formula, el.forecast_date_formula, el.planned_date_is_active, el.forecast_date_is_active,
            el.actual_date_is_active, el.creation_datetime, el.allow_notes, el.forecast_date_allow_notes, el.actual_date_allow_notes,
            el.planned_date_allow_notes, 0, el.requires_approval]);
        });
      });
      connection.resume();
    });
  })
  .on('end', function() {
    // all rows have been received
  });
EDIT
In this case I'm using a stream because millions of records are pulled from tb_project_milestones, manipulated, pushed into an array, and then inserted into another table.
Accumulating that many rows in an array would drive up the Node process's memory, which is why I'm considering a stream here.
Is streaming the better option, or could I instead do batch inserts in the DB using a transaction?
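For reference, here is a minimal sketch of the batch-insert idea mentioned above, assuming the node mysql driver's connection object already used in the code (a nested array passed to VALUES ? expands into a multi-row insert). The chunk size and the two-column list (col_a, col_b) are placeholders, not the real schema:

const BATCH_SIZE = 1000; // hypothetical chunk size

// Flush the accumulated rows into tb_xyz in chunks, inside one transaction.
function flushRows(connection, rows, done) {
  connection.beginTransaction(function (err) {
    if (err) return done(err);

    function insertChunk(offset) {
      if (offset >= rows.length) {
        return connection.commit(done);
      }
      const chunk = rows.slice(offset, offset + BATCH_SIZE);
      // 'VALUES ?' with a nested array produces a single multi-row INSERT.
      connection.query('INSERT INTO tb_xyz (col_a, col_b) VALUES ?', [chunk], function (err) {
        if (err) return connection.rollback(() => done(err));
        insertChunk(offset + BATCH_SIZE);
      });
    }

    insertChunk(0);
  });
}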
Answer 0 (score: 2)
You can use a knex stream and async iteration (ES2018 / Node 10) for this:
const knex = require('knex');

const knexClient = knex(someMysqlClientSettings);

async function copyMilestones(projectId) {
  const dbStream = knexClient("tb_project_milestones").where({ project_id: projectId }).stream();

  for await (const row of dbStream) {
    const processedRowObj = processRow(row); // your row transformation
    await knexClient("tb_xyz").insert(processedRowObj);
  }
}
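Inserting one row per iteration means one round trip per row, which can be slow for millions of records. A possible variant (a sketch, not part of the original answer; the chunk size of 500 is an arbitrary assumption) is to buffer the processed rows and flush them with knex's batchInsert:

// Sketch: buffer processed rows and insert them in chunks.
async function copyMilestonesBatched(projectId) {
  const dbStream = knexClient("tb_project_milestones").where({ project_id: projectId }).stream();

  let buffer = [];
  for await (const row of dbStream) {
    buffer.push(processRow(row));
    if (buffer.length >= 500) {
      await knexClient.batchInsert("tb_xyz", buffer, 500); // chunked multi-row inserts
      buffer = [];
    }
  }
  if (buffer.length > 0) {
    await knexClient.batchInsert("tb_xyz", buffer, 500);
  }
}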
Answer 1 (score: 2)
Wouldn't it be faster and simpler to execute a single SQL statement:
INSERT INTO tb_xyz (...)
    SELECT ... FROM tb_project_milestones;
That way, the data isn't shoveled out to the client only to be turned around and shoveled back into the server. You can apply transformations (expressions in the SELECT) and/or filtering (a WHERE clause in the SELECT) at the same time.
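As an illustrative sketch issued from Node, a single server-side query can do the copy, a computed expression, and a WHERE filter at once. The column list, the CONCAT expression, and the newProjectId / oldProjectId parameters are placeholders invented for the example, not the real schema:

// Hypothetical example: copy and transform rows entirely on the server.
const copySql = `
  INSERT INTO tb_xyz (project_milestone_id, name, project_id)
  SELECT project_milestone_id, CONCAT(prefix, ' ', name), ?
  FROM tb_project_milestones
  WHERE project_id = ?`;

connection.query(copySql, [newProjectId, oldProjectId], function (err, result) {
  if (err) throw err;
  console.log('Rows copied:', result.affectedRows);
});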
MySQL places essentially no limit on the size of the table.