Node streams - pushed Readable unintentionally ends up in all 3 write streams

Date: 2019-03-05 23:45:29

Tags: javascript node.js stream json2csv

Goal: objects get pushed into a Readable stream and are then saved to separate .csv files depending on which channel they came from (Email, Push, In-app).

Problem: I can't get the streams separated into distinct .pipe() "lines" so that each .csv log only receives the event objects for its own channel. In the current version, every .csv file created by the write streams receives event objects from all channels.

Question: Can I create the per-channel "pipe() lines" programmatically/dynamically inside the setup() function, or is the way I'm currently doing it the right approach?

Is manually creating each "pipe() line" the reason all of the .csv files are being filled with every event? Could this be solved with a single "pipe() line" and dynamic routing instead?
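Something like the following is what I have in mind for building the "pipe() lines" programmatically (just a sketch, not working code; it assumes each channel gets its own Json2csvTransform instance instead of the single shared json2csv in my code further down, and reuses the same Readable/rfs/opts/transformOpts names defined there):

// Sketch only: wire each channel's pipe line inside the loop
function makeStreams(listArray) {
    var streamObjs = {}
    listArray.forEach(function (channelName) {
        const readStream = new Readable({ objectMode: true, read() {} })
        const writeStream = rfs(channelName + '.csv', { size: "50M", interval: "1d" })
        // one transform per channel, piped straight to that channel's file
        readStream.pipe(new Json2csvTransform(opts, transformOpts)).pipe(writeStream)
        streamObjs[channelName] = { instream: readStream, outstream: writeStream }
    })
    return streamObjs
}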

Quick description of the code below:

setup() calls makeStreams(), which creates an object containing a Readable and a Writable (a rotating-file-system write stream) for each channel. (setup() is an unnecessary function right now, but it will take on more setup tasks later.)

pushStream() is called whenever an inbound event occurs and pushes an object such as { Email: { queryParam: 1, queryParam: 2, ... } }. The event is sorted by its top-level key ("Email" in this case) and pushed into the matching Readable stream, which in theory should pipe it on to the matching write stream. Unfortunately that is not what happens: the event object gets sent to every write stream. How do I send it only to the correct one?
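For example, an inbound event triggers a call roughly like this (the field names are made up):

// hypothetical event: the top-level key names the channel
pushStream({ Email: { recipient: "user@example.com", campaign: "spring-sale" } })
// expected: one row appended to Email.csv only
// actual: the row ends up in Email.csv, Push Notification.csv and In-app Message.csv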

Code

const Readable = require('stream').Readable
const Json2csvTransform = require('json2csv').Transform;
var rfs = require("rotating-file-stream");

const channelTypes = ['Push Notification', 'Email', 'In-app Message']
var streamArr = setup(channelTypes);
const opts = {};
const transformOpts = {
    objectMode: true
};

const json2csv = new Json2csvTransform(opts, transformOpts);

function setup(list) {
    console.log("Setting up streams...")
    streamArr = makeStreams(list) //makes streams out of each endpoint
    return streamArr
}

//Stream Builder for Logging Based Upon Channel Name
function makeStreams(listArray) {
    listArray = ['Push Notification', 'Email', 'In-app Message']
    var length = listArray.length
    var streamObjs = {}
    for (var name = 0; name < length; name++) {
        var fileName = listArray[name] + '.csv'
        const readStream = new Readable({
            objectMode: true,
            read() {}
        })
        const writeStream = rfs(fileName, {
            size: "50M", // rotate every 50 MegaBytes written
            interval: "1d" // rotate daily
            //compress: "gzip" // compress rotated files
        });

        var objName = listArray[name]
        var obj = {
            instream: readStream,
            outstream: writeStream
        }
        streamObjs[objName] = obj
    }
    return streamObjs
}

function pushStream(obj) {
    var keys = Object.keys(obj)

    if (streamArr[keys]) {
        streamArr[keys].instream.push(obj[keys])
    } else {
        console.log("event without a matching channel error")
    }
}

//Had to make each pipe line here manually. Can this be improved? Is it the reason all of the files are receiving all events?
streamArr['Email'].instream.pipe(json2csv).pipe(streamArr['Email'].outstream)
streamArr['In-app Message'].instream.pipe(json2csv).pipe(streamArr['In-app Message'].outstream)
streamArr['Push Notification'].instream.pipe(json2csv).pipe(streamArr['Push Notification'].outstream)

module.exports = {
    makeStreams,
    pushStream,
    setup
}
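For context, the module is meant to be consumed from the rest of the app roughly like this (the require path is a placeholder; setup() already runs when the module is loaded):

// elsewhere in the app
const channelLogger = require('./channelLogger')

// called for every inbound event
channelLogger.pushStream({ 'Push Notification': { deviceId: 'abc123', title: 'Hello' } })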

0 answers