Can anyone point me in the right direction? Why can't I import my data into MongoDB? When I try to import only the first 100 lines of the file, I get:
➜  database-operations git:(master) ✗ node import_acparts_to_mongdb.js
(node:10216) Warning: Possible EventEmitter memory leak detected. 11 close listeners added. Use emitter.setMaxListeners() to increase limit
➜  database-operations git:(master) ✗
When I try to import 600,000 lines from the same file, a CSV with the following structure:
facility;item_number;part_name;part_description;net_weight;customs_statistical
PBL;5535210444;COVER;COVER;0;84314980
D37;5535211545;BRACKET;BRACKET-FIRE SUPP TANK A101-20;2,939;72169110
PBL;5535211234;BRACKET;BRACKET-FIRE SUPP TANK A101-20;2,939;84314300
PBL;5535212478;RING-SNAP;RING-SNAP;0045;84314980
.......
.......
➜  database-operations git:(master) ✗ node import_acparts_to_mongdb.js
<--- Last few GCs --->

   38787 ms: Mark-sweep 1384.9 (1436.8) -> 1384.8 (1436.8) MB, 1181.9 / 0.0 ms [allocation failure] [GC in old space requested].
   39964 ms: Mark-sweep 1384.8 (1436.8) -> 1384.8 (1436.8) MB, 1177.7 / 0.0 ms [allocation failure] [GC in old space requested].
   41199 ms: Mark-sweep 1384.8 (1436.8) -> 1385.8 (1420.8) MB, 1234.0 / 0.0 ms [last resort gc].
   42429 ms: Mark-sweep 1385.8 (1420.8) -> 1386.9 (1420.8) MB, 1229.8 / 0.0 ms [last resort gc].

<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x4962c9cfb39
    1: $__validate [/Users/isaklafleur/Dropbox/Isak/Coding/Other/autoMDM/node_modules/mongoose/lib/document.js:~1404] [pc=0xe52ebc4fd97] (this=0x383867c1f221, callback=0x383867c201e1)
    2: validate [/Users/isaklafleur/Dropbox/Isak/Coding/Other/autoMDM/node_modules/mongoose/lib/document.js:~1324] [pc=0x...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
 1: node::Abort() [/usr/local/bin/node]
 2: node::FatalException(v8::Isolate*, v8::Local<v8::Value>, v8::Local<v8::Message>) [/usr/local/bin/node]
 3: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [/usr/local/bin/node]
 4: v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationSpace) [/usr/local/bin/node]
 5: v8::internal::Runtime_AllocateInTargetSpace(int, v8::internal::Object**, v8::internal::Isolate*) [/usr/local/bin/node]
 6: 0xe52eb8079a7
[1]    10085 abort      node import_acparts_to_mongdb.js
➜  database-operations git:(master) ✗
const mongoose = require('mongoose'),
  parse = require('csv-parse'),
  path = require('path'),
  fs = require('fs'),
  ACpart = require('./models/acparts');

mongoose.Promise = require('bluebird');
mongoose.connect('mongodb://localhost/automdm_test');

const db = mongoose.connection;
db.on('error', console.error.bind(console, 'connection error:'));
db.once('open', function() {
  // we're connected!
  const p = path.join(__dirname, '/../', 'file-operations', 'csv-files');
  //console.log(p);
  const parser = parse({delimiter: ';'}, function(err, data) {
    //console.log(data);
    const facility = data.map((item, i) => data[i][0]);
    const item_number = data.map((item, i) => data[i][1]);
    const part_name = data.map((item, i) => data[i][2]);
    const part_description = data.map((item, i) => data[i][3]);
    const net_weight = data.map((item, i) => data[i][4]);
    const customs_statistical = data.map((item, i) => data[i][5]);
    // Looping and storing the data into mongodb
    for (let i = 1; i < data.length; i++) {
      const newACpart = new ACpart();
      newACpart.facility = facility[i];
      newACpart.item_number = item_number[i];
      newACpart.part_name = part_name[i];
      newACpart.part_description = part_description[i];
      newACpart.net_weight = net_weight[i];
      newACpart.customs_statistical = customs_statistical[i];
      newACpart.save()
        .then(function() {
          mongoose.disconnect();
        })
        .catch(function(err) {
          console.log('There was an error', err);
        });
    }
  });
  fs.createReadStream(p + '/mrsparts.csv').pipe(parser);
});
Answer (score: 1)
You won't be able to fit everything in memory if the file is bigger than your heap. The callback form of csv-parse hands you the entire parsed file as one array, and the script above also builds six more full-length arrays from it. Use a streaming CSV parser instead, and send the rows to the database in batches rather than all at once.