Combining Autobahn WebSockets, GStreamer, and the HTML5 MediaSource API

Date: 2015-03-11 21:14:08

Tags: websocket html5-video autobahn webm vp8

I am running a WebSocket server with autobahn|python. On the server side I also run a GStreamer pipeline, which I use to capture WebM frames via an "appsink". The pipeline implemented is:

gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480 ! videoconvert ! vp8enc ! webmmux ! appsink name="sink"

Each time I receive a buffer in the appsink, I send it over the WebSocket as a binary "message" using sendMessage:

def on_new_buffer(appsink):
    # Pull the newest sample from the appsink and extract its raw bytes.
    gstsample = appsink.emit('pull-sample')
    gstbuffer = gstsample.get_buffer()
    frame_data = gstbuffer.extract_dup(0, gstbuffer.get_size())
    # Broadcast the chunk to every connected client as a binary
    # WebSocket message (isBinary=True).
    for c in global_clients:
        c.sendMessage(frame_data, True)
        print("Directly sent: {0} bytes".format(len(frame_data)))

    return False

On the client side, I have a somewhat convoluted flow for the received frame_data blobs. There is a FileReader, a MediaSource, and a SourceBuffer. Whenever frame_data is received, it is read into a buffer with the FileReader. If the FileReader is busy reading the previous frame_data, the new blob is appended to a "buffer_pool". Once the frame_data has been read into a buffer, it is appended to the "sourceBuffer". If the "sourceBuffer" is still updating with the previous chunk, the data is appended to a "sourceBufferpool" instead.

    <script>
    var video = document.getElementById('v');
    var playButton = document.getElementById('playbutton');
    var mediaSource;
    var sourceBuffer;
    var buffer_pool = [];
    var sourceBufferpool = [];

    function setupVideo() {
        window.MediaSource = window.MediaSource || window.WebKitMediaSource;
        if (!window.MediaSource) {
            alert('MediaSource API is not available');
            return;
        }
        mediaSource = new MediaSource();
        video.src = window.URL.createObjectURL(mediaSource);
        mediaSource.addEventListener('sourceopen', function (e) {
            try {
                sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
            } catch(e) {
                console.log('Exception calling addSourceBuffer for video', e);
                return;
            }

            //sourceBuffer.addEventListener('updatestart', function(e) { console.log('updatestart: ' + e.target + mediaSource.readyState); });
            //sourceBuffer.addEventListener('updateend', function(e) { console.log('updateend: ' + e.target + mediaSource.readyState); });
            sourceBuffer.addEventListener('error', function(e) { console.log('error: ' + e.target + mediaSource.readyState); });
            sourceBuffer.addEventListener('abort', function(e) { console.log('abort: ' + e.target + mediaSource.readyState); });

            sourceBuffer.addEventListener('update', function() {
                if (sourceBufferpool.length > 0 && !sourceBuffer.updating) {
                    try {
                        sourceBuffer.appendBuffer(sourceBufferpool.shift());
                        console.log('update: pooled buffer appended ' + sourceBufferpool.length + mediaSource.readyState);
                    }catch(e){
                        console.log('Exception calling appendBuffer for video ', e);
                        return;
                    }
                }
            },false)

            if (video.paused) {
                video.play()
            }

            startWSStreaming();
        },false)

        mediaSource.addEventListener('sourceended', function(e) { console.log('sourceended: ' + mediaSource.readyState); });
        mediaSource.addEventListener('sourceclose', function(e) { console.log('sourceclose: ' + mediaSource.readyState); });
        mediaSource.addEventListener('error', function(e) { console.log('error: ' + mediaSource.readyState); });

    }

    function startWSStreaming() {
        var reader = new FileReader();

        reader.onload = function (evt) {
            if (sourceBuffer.updating || sourceBufferpool.length > 0){
                sourceBufferpool.push(new Uint8Array(evt.target.result));
                console.log('update: pooled buffer appended ' + sourceBufferpool.length + mediaSource.readyState);
            }else{
                sourceBuffer.appendBuffer(new Uint8Array(evt.target.result));
                console.log('update: direct buffer appended ' + sourceBufferpool.length + mediaSource.readyState);
            }
        }

        reader.onloadend = function (evt) {
            if (buffer_pool.length > 0) {
                var chunk = new Blob([buffer_pool.shift()], {type: 'video/webm'});
                evt.target.readAsArrayBuffer(chunk);
                console.log('Processed buffer pool: current size ' + buffer_pool.length);
            }
        }

        var ws = new WebSocket("ws://localhost:9000/");
        ws.onopen = function () {
            document.getElementById("MSG1").innerHTML = 'Websocket opened <br>';
        }
        ws.onmessage = function(e) {
            var myBuffer = e.data;
            if (reader.readyState == 1 || buffer_pool.length > 0) {
                buffer_pool.push(myBuffer);
                console.log('Received buffer pooled: current size ' + buffer_pool.length);
            }else{
                var chunk = new Blob([myBuffer], {type: 'video/webm'});
                reader.readAsArrayBuffer(chunk);
                console.log('First buffer processed');
            }
        }

    }       
</script>
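The two-stage pooling in the script above can be modeled language-agnostically. The following is a minimal Python sketch of the same hand-off (the class and method names here are hypothetical; in the actual page the roles are played by FileReader, SourceBuffer.updating, and the two JavaScript arrays):

```python
from collections import deque

class PoolingSink:
    """Accepts chunks; appends directly when idle, pools them when busy."""
    def __init__(self):
        self.pool = deque()      # models sourceBufferpool
        self.appended = []       # models data handed to sourceBuffer
        self.updating = False    # models sourceBuffer.updating

    def receive(self, chunk):
        # Mirrors reader.onload: pool if busy or a backlog exists,
        # otherwise append directly, preserving arrival order.
        if self.updating or self.pool:
            self.pool.append(chunk)
        else:
            self._append(chunk)

    def _append(self, chunk):
        self.appended.append(chunk)
        self.updating = True

    def on_update_end(self):
        # Mirrors the sourceBuffer 'update' listener: drain one pooled chunk.
        self.updating = False
        if self.pool:
            self._append(self.pool.popleft())

sink = PoolingSink()
for c in (b'a', b'b', b'c'):
    sink.receive(c)            # b'b' and b'c' get pooled while 'updating'
while sink.updating:
    sink.on_update_end()
print(sink.appended)           # chunks drain in arrival order
```

Note that this scheme preserves ordering only as long as every chunk goes through the same pool check; a chunk that bypasses the pool while a backlog exists would be appended out of order.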

Now, the end result is that I see only a single frame in the browser window, and then the video freezes. After inspecting chrome://media-internals/, I got the following clue:

Timestamp   Property    Value
00:00:00 00 pipeline_state  kCreated
00:00:00 00 EVENT   PIPELINE_CREATED
00:00:00 00 EVENT   WEBMEDIAPLAYER_CREATED
00:00:00 00 url blob:http%3A//localhost%3A8080/09060a78-9759-4fcd-97a2-997121ba6122
00:00:00 00 pipeline_state  kInitDemuxer
00:00:01 668    duration    unknown
00:00:01 669    pipeline_state  kInitVideoRenderer
00:00:01 685    pipeline_state  kPlaying
00:00:03 820    EVENT   PLAY
00:00:04 191    error   Got a block with a timecode before the previous block.
00:00:04 191    pipeline_error  pipeline: decode error
00:00:04 191    pipeline_state  kStopping
00:00:04 192    pipeline_state  kStopped
00:00:28 483    EVENT   WEBMEDIAPLAYER_DESTROYED
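The decode error in the log ("Got a block with a timecode before the previous block") suggests the demuxer enforces non-decreasing block timecodes. A hypothetical, heavily simplified sketch of such a check (not the actual Chromium demuxer code):

```python
def first_out_of_order(timecodes):
    """Return the index of the first block whose timecode precedes its
    predecessor, or None if timecodes are monotonically non-decreasing."""
    prev = None
    for i, t in enumerate(timecodes):
        if prev is not None and t < prev:
            return i  # this block would trigger the decode error
        prev = t
    return None

# A stream whose fourth block jumps backwards fails at index 3.
print(first_out_of_order([0, 33, 66, 33]))  # → 3
```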
Phew, what a long description! I hope you've made it this far. Now, the real questions:

  1. Why does the video freeze after rendering just one frame?
  2. Is it because of the WebSocket "sendMessage" method, since I send the webm chunks as distinct messages, whereas this should instead be handled with "sendMessageFrameData"?
  3. Do I need to impose some ordering on the arriving frame_data so that chunks are processed in the order they were sent?
  4. Or is my whole approach wrong?
  5. Please help!

0 Answers:

No answers yet