Parse a large JSON file in Node.js and process each object independently

Date: 2017-03-20 05:31:55

Tags: javascript json node.js parsing

I need to read a large JSON file (about 630MB) in Node.js and insert each object into MongoDB.

I have read the answer here: Parse large JSON file in Nodejs

However, that answer processes the JSON file line by line rather than object by object, so I still don't know how to get the objects out of the file and operate on them.

There are about 100,000 objects of this kind in my JSON file.

Data format:

[
  {
    "id": "0000000",
    "name": "Donna Blak",
    "livingSuburb": "Tingalpa",
    "age": 53,
    "nearestHospital": "Royal Children's Hospital",
    "treatments": {
        "19890803": {
            "medicine": "Stomach flu B",
            "disease": "Stomach flu"
        },
        "19740112": {
            "medicine": "Progeria C",
            "disease": "Progeria"
        },
        "19830206": {
            "medicine": "Poliomyelitis B",
            "disease": "Poliomyelitis"
        }
    },
    "class": "patient"
  },
 ...
]

Cheers,

Alex

2 answers:

Answer 0 (score: 23)

There is a nice module named 'stream-json' that does exactly what you want.

It can parse JSON files far exceeding the available memory.

StreamArray handles a frequent use case: a huge array of relatively small objects, similar to Django-produced database dumps. It streams array components individually, taking care of assembling them automatically.

Here is a very basic example:

const StreamArray = require('stream-json/streamers/StreamArray');
const path = require('path');
const fs = require('fs');

const jsonStream = StreamArray.withParser();

// You'll get parsed JSON objects here;
// key is the array index
jsonStream.on('data', ({key, value}) => {
    console.log(key, value);
});

jsonStream.on('end', () => {
    console.log('All done');
});

const filename = path.join(__dirname, 'sample.json');
fs.createReadStream(filename).pipe(jsonStream.input);

If you want to do something more complex, e.g. process the objects sequentially (preserving their order) and apply some asynchronous operation to each of them, you can implement a custom Writable stream like this:

const StreamArray = require('stream-json/streamers/StreamArray');
const {Writable} = require('stream');
const path = require('path');
const fs = require('fs');

const fileStream = fs.createReadStream(path.join(__dirname, 'sample.json'));
const jsonStream = StreamArray.withParser();

const processingStream = new Writable({
    write({key, value}, encoding, callback) {
        // Save to MongoDB or do any other async action here

        setTimeout(() => {
            console.log(value);
            // The next record will be read only once the current one is fully processed
            callback();
        }, 1000);
    },
    // Don't skip this, as we need to operate on objects, not buffers
    objectMode: true
});

// Pipe the streams together as follows
fileStream.pipe(jsonStream.input);
jsonStream.pipe(processingStream);

// Wait for the 'finish' event, fired when everything is done
processingStream.on('finish', () => console.log('All done'));

Please note: the examples above were tested against 'stream-json@1.1.3'. For some previous versions (presumably before 1.0.0) you might have to run:

const StreamArray = require('stream-json/utils/StreamArray');

and then

const jsonStream = StreamArray.make();
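
Since the question is specifically about inserting each object into MongoDB, here is a minimal sketch of how the Writable stream above could call the official 'mongodb' driver instead of the setTimeout placeholder. This is an illustration rather than tested code: the connection string and the 'hospital'/'patients' database and collection names are placeholders, and the connect callback shown assumes the 3.x driver API (which yields a client rather than a db):

const StreamArray = require('stream-json/streamers/StreamArray');
const {Writable} = require('stream');
const {MongoClient} = require('mongodb');
const path = require('path');
const fs = require('fs');

MongoClient.connect('mongodb://localhost:27017', (err, client) => {
    if (err) throw err;
    // 'hospital' and 'patients' are placeholder names
    const collection = client.db('hospital').collection('patients');

    const fileStream = fs.createReadStream(path.join(__dirname, 'sample.json'));
    const jsonStream = StreamArray.withParser();

    const processingStream = new Writable({
        write({key, value}, encoding, callback) {
            // Handing the stream's callback to insertOne gives the same
            // backpressure as the setTimeout example: the next record is
            // read only after the current insert has completed
            collection.insertOne(value, err => callback(err));
        },
        objectMode: true
    });

    fileStream.pipe(jsonStream.input);
    jsonStream.pipe(processingStream);

    processingStream.on('finish', () => {
        console.log('All done');
        client.close();
    });
});

If one insert per document proves too slow for 100,000 records, a common variation is to buffer objects inside write() and flush them in batches with insertMany().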

Answer 1 (score: 0)

I made a package specifically for this. If you're familiar with rxjs, you'll feel right at home:

rxjs-stream
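
For context, here is a minimal sketch of how that could look, assuming rxjs-stream's streamToRx() helper (which wraps a Node readable stream as an RxJS Observable) and reusing stream-json from the answer above, so that the Observable emits parsed objects rather than raw buffer chunks:

const StreamArray = require('stream-json/streamers/StreamArray');
const {streamToRx} = require('rxjs-stream');
const fs = require('fs');

// stream-json emits {key, value} pairs in object mode;
// streamToRx turns that stream into an Observable
const jsonStream = StreamArray.withParser();
fs.createReadStream('sample.json').pipe(jsonStream.input);

streamToRx(jsonStream).subscribe({
    next: ({value}) => console.log(value),
    complete: () => console.log('All done')
});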