I'm building a WebRTC app that captures the entire screen with getDisplayMedia() and streams it over the network to multiple viewers.
I can render the stream correctly by building a simple Blob and setting it as the video source, but that approach is slow.
So, looking for something faster, I tried MediaSource().
Now, after the first chunk, I always get "Uncaught DOMException: Failed to set the 'timestampOffset' property on 'SourceBuffer': This SourceBuffer has been removed from the parent media source.", and even the first chunk produces no video.
Here is an excerpt of my code:
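For reference, the Blob path that does work looks roughly like this (simplified; the chunksToBlobUrl name and the video/webm MIME type are my own choices, matching my recorder settings):

```javascript
// The slow-but-working path, simplified: collect the received chunks and
// point the <video> at a Blob URL built from all of them.
function chunksToBlobUrl(chunks) {
  var blob = new Blob(chunks, { type: 'video/webm' });
  return URL.createObjectURL(blob);
}

// In the page I then do roughly:
//   video.src = chunksToBlobUrl(receivedChunks);
// which re-creates the whole Blob on every update, hence the slowness.
```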
var video = document.getElementById('video');
var socket = io("http://localhost:3000");

var mediaSource = new MediaSource(),
    mediaBuffer,
    // init duration of 0 seems fine
    duration = 0;

video.src = window.URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function (e) {
  mediaBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8, vorbis"');
  mediaBuffer.addEventListener('update', function () {
    // wait for mediaBuffer update to fire before setting the new duration
    duration = video.duration;
    console.log('updated');
  });
  mediaBuffer.addEventListener('updateend', function () {
    duration = 0;
    console.log('updateend');
  });
}, false);

socket.on('transfer', function (data) {
  mediaBuffer.timestampOffset = duration;
  mediaBuffer.appendBuffer(new Uint8Array(data));
  video.play();
});
And the server-side code is:
socket.broadcast.emit('transfer', data);
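For completeness, my understanding is that a SourceBuffer only allows one operation at a time: calling appendBuffer() or setting timestampOffset while sourceBuffer.updating is true throws, so incoming chunks have to be queued until 'updateend' fires. A minimal sketch of that queue as I understand it (createAppendQueue is my own name, and FakeSourceBuffer is only a stand-in so the sketch is self-contained outside a browser):

```javascript
// Serialize appends: only hand the next chunk to the SourceBuffer once the
// previous append has finished (signalled by 'updateend').
function createAppendQueue(sourceBuffer) {
  const pending = [];
  sourceBuffer.addEventListener('updateend', flush);
  function flush() {
    // Append the oldest queued chunk, but only if no append is in flight.
    if (!sourceBuffer.updating && pending.length > 0) {
      sourceBuffer.appendBuffer(pending.shift());
    }
  }
  return {
    push(chunk) {
      pending.push(chunk);
      flush();
    },
  };
}

// Stand-in for a real SourceBuffer: stays "updating" until the next
// microtask, then fires 'updateend', like the async append in MSE.
class FakeSourceBuffer {
  constructor() {
    this.updating = false;
    this.appended = [];
    this.listeners = [];
  }
  addEventListener(type, fn) {
    if (type === 'updateend') this.listeners.push(fn);
  }
  appendBuffer(chunk) {
    if (this.updating) throw new Error('InvalidStateError');
    this.updating = true;
    this.appended.push(chunk);
    queueMicrotask(() => {
      this.updating = false;
      this.listeners.forEach((fn) => fn());
    });
  }
}
```

With this in place the 'transfer' handler would just call queue.push(new Uint8Array(data)) instead of touching the buffer directly, but I'm not sure whether the missing queue alone explains the "removed from the parent media source" error.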