Two streams:
Given readable streams stream1 and stream2, what's an idiomatic (concise) way to get a stream containing stream1 and stream2 concatenated?
I can't do stream1.pipe(outStream); stream2.pipe(outStream), because then the stream contents are jumbled together.
n streams:
Given an EventEmitter that emits an indeterminate number of streams, e.g.
eventEmitter.emit('stream', stream1)
eventEmitter.emit('stream', stream2)
eventEmitter.emit('stream', stream3)
...
eventEmitter.emit('end')
what's an idiomatic way to get a stream with all of them concatenated together?
Answer 0 (score: 12)
The combined-stream package concatenates streams. Example from the README:
var CombinedStream = require('combined-stream');
var fs = require('fs');
var combinedStream = CombinedStream.create();
combinedStream.append(fs.createReadStream('file1.txt'));
combinedStream.append(fs.createReadStream('file2.txt'));
combinedStream.pipe(fs.createWriteStream('combined.txt'));
I believe you have to append all the streams at once. If the queue becomes empty, combinedStream automatically ends. See issue #5.
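If appending everything up front is a limitation, combined-stream also accepts a callback form of append that hands over a stream only when it is needed; a sketch based on my reading of its README (file names are illustrative):
var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create();
// Appending a function defers stream creation until combined-stream
// asks for the next source by calling next(stream).
combinedStream.append(function(next) {
  next(fs.createReadStream('file1.txt'));
});
combinedStream.append(function(next) {
  next(fs.createReadStream('file2.txt'));
});
combinedStream.pipe(fs.createWriteStream('combined.txt'));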
The stream-stream library is an alternative with an explicit .end, but it's less popular and presumably not as well tested. It uses the streams2 API of Node 0.10 (see this discussion).
Answer 1 (score: 8)
This can be done with vanilla Node.js:
import { PassThrough } from 'stream'

const merge = (...streams) => {
  const pass = new PassThrough()
  let waiting = streams.length
  for (const stream of streams) {
    // pipe each source into the shared PassThrough,
    // but don't let any single source end it
    stream.pipe(pass, { end: false })
    // emit 'end' once the last source has finished
    stream.once('end', () => --waiting === 0 && pass.emit('end'))
  }
  return pass
}
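A usage sketch (file names are illustrative). Note that all the sources are piped concurrently, so if they produce at different times their chunks can interleave; a later answer below discusses this.
import fs from 'fs'

const merged = merge(
  fs.createReadStream('file1.txt'),
  fs.createReadStream('file2.txt')
)
merged.pipe(fs.createWriteStream('combined.txt'))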
Answer 2 (score: 8)
const {PassThrough} = require('stream')
let joined = [s0, s1, s2, ...sN].reduce((pt, s, i, a) => {
  s.pipe(pt, { end: false })
  // emit 'end' only once every source stream has ended
  // (readableEnded requires Node.js 12.9+)
  s.once('end', () => a.every(s => s.readableEnded) && pt.emit('end'))
  return pt
}, new PassThrough())
Cheers ;)
Answer 3 (score: 3)
You could probably make it more concise, but here's something that works:
var util = require('util');
var EventEmitter = require('events').EventEmitter;
function ConcatStream(streamStream) {
  EventEmitter.call(this);
  var isStreaming = false,
      streamsEnded = false,
      that = this;

  var streams = [];
  streamStream.on('stream', function(stream) {
    stream.pause();
    streams.push(stream);
    ensureState();
  });

  streamStream.on('end', function() {
    streamsEnded = true;
    ensureState();
  });

  var ensureState = function() {
    if (isStreaming) return;
    if (streams.length == 0) {
      if (streamsEnded)
        that.emit('end');
      return;
    }
    isStreaming = true;
    streams[0].on('data', onData);
    streams[0].on('end', onEnd);
    streams[0].resume();
  };

  var onData = function(data) {
    that.emit('data', data);
  };

  var onEnd = function() {
    isStreaming = false;
    streams[0].removeAllListeners('data');
    streams[0].removeAllListeners('end');
    streams.shift();
    ensureState();
  };
}
util.inherits(ConcatStream, EventEmitter);
We keep track of state with streams (the queue of streams; push onto the back and shift off the front), isStreaming, and streamsEnded. When we get a new stream, we push it, and when a stream ends, we stop listening and shift it. When the stream of streams ends, we set streamsEnded.
On each event, we check the state we're in. If we're already streaming (piping a stream), we do nothing. If the queue is empty and streamsEnded is set, we emit the end event. If there's something in the queue, we resume it and listen to its events.
*Note that pause and resume are advisory, so some streams may not behave correctly and would require buffering. This exercise is left to the reader.
Having done all of this, I would handle the n = 2 case by constructing an EventEmitter, creating a ConcatStream with it, and emitting two stream events followed by an end event. I'm sure it could be done more concisely, but we may as well use what we've got.
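A minimal sketch of that n = 2 exercise (the file names are illustrative):
var fs = require('fs');

var emitter = new EventEmitter();
var concatenated = new ConcatStream(emitter);
concatenated.on('data', function(chunk) {
  process.stdout.write(chunk);
});

emitter.emit('stream', fs.createReadStream('file1.txt'));
emitter.emit('stream', fs.createReadStream('file2.txt'));
emitter.emit('end');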
Answer 4 (score: 3)
https://github.com/joepie91/node-combined-stream2 is a drop-in, Streams2-compatible replacement for the combined-stream module described above. It automatically wraps Streams1 streams.
Example code for combined-stream2:
var CombinedStream = require('combined-stream2');
var fs = require('fs');
var combinedStream = CombinedStream.create();
combinedStream.append(fs.createReadStream('file1.txt'));
combinedStream.append(fs.createReadStream('file2.txt'));
combinedStream.pipe(fs.createWriteStream('combined.txt'));
Answer 5 (score: 1)
streamee.js, a set of stream transformers and composers based on Node 1.0+ streams, includes a concatenate method:
var stream1ThenStream2 = streamee.concatenate([stream1, stream2]);
Answer 6 (score: 1)
Neither of the top-voted answers here works with asynchronous streams, because they pipe everything through regardless of whether the source streams are ready to produce. I had to combine in-memory string streams with a data feed from a database, and the database content always ended up at the tail of the resulting stream, because it takes a second to get a database response. This is what I ended up writing for my purposes.
import { PassThrough, Readable } from 'stream';

export function joinedStream(...streams: Readable[]): Readable {
  function pipeNext(): void {
    const nextStream = streams.shift();
    if (nextStream) {
      nextStream.pipe(out, { end: false });
      nextStream.on('end', function() {
        pipeNext();
      });
    } else {
      out.end();
    }
  }
  const out = new PassThrough();
  pipeNext();
  return out;
}
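A runnable sketch of the scenario described above, simulating the slow database feed with a PassThrough that only produces after a delay (Readable.from needs Node.js 12.3+; joinedStream is the function defined above):
import { Readable } from 'stream';

// In-memory content, ready immediately.
const header = Readable.from(['header from memory\n']);

// Simulated slow source, e.g. a database feed that answers after a second.
const slow = new PassThrough();
setTimeout(() => slow.end('database rows\n'), 1000);

// The output preserves the declared order: header first, then the slow feed.
joinedStream(header, slow).pipe(process.stdout);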
Answer 7 (score: 1)
This can now be done easily using async iterators:
async function* concatStreams(readables) {
  for (const readable of readables) {
    for await (const chunk of readable) { yield chunk }
  }
}
You can use it like this:
const fs = require('fs')
const stream = require('stream')
const files = ['file1.txt', 'file2.txt', 'file3.txt']
// calling the async generator returns an async iterable; no await is needed
const iterable = concatStreams(files.map(f => fs.createReadStream(f)))
// convert the async iterable to a readable stream
const mergedStream = stream.Readable.from(iterable)
More on async iterators: https://2ality.com/2019/11/nodejs-streams-async-iteration.html
Answer 8 (score: 0)
In vanilla Node.js using ECMAScript 2015+, combining the good answers of Ivo and Feng. The PassThrough class is a trivial Transform stream that does not modify the stream in any way.
const { PassThrough } = require('stream');
const concatStreams = (streamArray, streamCounter = streamArray.length) => streamArray
  .reduce((mergedStream, stream) => {
    // pipe each stream of the array into the merged stream
    // prevent the automated 'end' event from firing
    mergedStream = stream.pipe(mergedStream, { end: false });
    // rewrite the 'end' event handler
    // Every time one of the streams ends, the counter is decremented.
    // Once the counter reaches 0, the mergedStream can emit its 'end' event.
    stream.once('end', () => --streamCounter === 0 && mergedStream.emit('end'));
    return mergedStream;
  }, new PassThrough());
It can be used like this:
const mergedStreams = concatStreams([stream1, stream2, stream3]);
Answer 9 (score: 0)
The following code worked for me :). It takes input from all the answers given earlier.
const { PassThrough } = require('stream')

// Note: this assumes the array contains at least two streams.
const pipeStreams = (streams) => {
  const out = new PassThrough()
  // Pipe the first stream to the out stream,
  // preventing out's automated 'end' event from firing
  streams[0].pipe(out, { end: false })
  for (let i = 0; i < streams.length - 2; i++) {
    // On the end of each stream (up to the second last),
    // pipe the next stream to the out stream,
    // again preventing the automated 'end' event from firing
    streams[i].on('end', () => {
      streams[i + 1].pipe(out, { end: false })
    })
  }
  // On the end of the second-last stream, pipe the last stream
  // to the out stream; this time let the 'end' event fire
  streams[streams.length - 2].on('end', () => {
    streams[streams.length - 1].pipe(out)
  })
  return out
}
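A usage sketch under that two-or-more assumption (file names are illustrative):
const fs = require('fs')

const out = pipeStreams([
  fs.createReadStream('file1.txt'),
  fs.createReadStream('file2.txt'),
])
out.pipe(fs.createWriteStream('combined.txt'))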