GStreamer: rendering rtspsrc in WinForms (and WPF)

Asked: 2018-04-25 14:24:03

Tags: c# wpf winforms gstreamer

The application I am trying to write takes a video stream from a video server on the network and displays it in a WinForms window (later I want to host the same kind of control in WPF). I am using gstreamer-sharp because my application is C#/.NET based.

I successfully got videotestsrc running based on the code sample in this answer, and was able to create multiple instances of videotestsrc rendering into the window as required, using VideoOverlayAdapter and a set of WinForms panels.

When I moved on to doing the same with rtspsrc I naturally hit a few obstacles I would like to get past; the code for my class is below.

I do not link the rtspsrc in the initialisation code; instead, I believe I need to link the rtspsrc's new pad to the next element (in this case rtph264depay) when it appears, and this is where I am running into trouble.

The PadAdded event sometimes fires within a few seconds of starting the program, and sometimes does not fire at all. The server works fine with the gstreamer-sharp version of the basic tutorial (part 1), with good latency (easily under 300 ms, though I still need to do glass-to-glass testing once the application is running).

When the PadAdded event does eventually fire, I get a NOFORMAT status when trying to link the new pad to the rtph264depay sink pad.

I have also noticed that I do not seem to receive the prepare-window-handle bus sync message, where I would set up the VideoOverlayAdapter as in the gstVideoOverlay sample (so even if the pad link succeeded, I would not get output into the window handle I need).

I have not been able to find this particular problem (an rtspsrc pad failing to link to the rtph264depay sink) discussed anywhere, as similar questions seem to be about linking other elements together.

According to the debug messages, the initial linking of the remaining elements in the initialisation code succeeds.

The end goal is to get frames into OpenCV/Emgu for some analysis and basic overlay work.

Any help with this would be greatly appreciated.

Many thanks!

/// <summary>
/// class to create a gstreamer pipeline based on an rtsp stream at the provided URL
/// </summary>
class gstPipeline2
{
    // elements for the pipeline
    private Element rtspsrc, rtph264depay, decoder, videoConv, videoSink;
    private System.Threading.Thread mainGLibThread;
    private GLib.MainLoop mainLoop;

    // the window handle (passed in)
    private IntPtr windowHandle;
    // our pipeline
    private Pipeline currentPipeline = null;

    /// <summary>
    /// Create a new gstreamer pipeline rendering the stream at URL into the provided window handle 
    /// </summary>
    /// <param name="WindowHandle">The handle of the window to render to </param>
    /// <param name="Url">The url of the video stream</param>
    public gstPipeline2(IntPtr WindowHandle, string Url)
    {
        windowHandle = WindowHandle;    // get the handle and save it locally

        // initialise the gstreamer library and associated threads (for diagnostics)
        Gst.Application.Init(); 
        mainLoop = new GLib.MainLoop();
        mainGLibThread = new System.Threading.Thread(mainLoop.Run);
        mainGLibThread.Start();

        // create each element now for the pipeline
        // starting with the rtspsrc
        rtspsrc = ElementFactory.Make("rtspsrc", "udpsrc0");  // create an rtsp source
        rtspsrc["location"] = Url;   // and set its location (the source of the data)
        rtph264depay = ElementFactory.Make("rtph264depay", "rtph264depay0");    
        decoder = ElementFactory.Make("avdec_h264", "decoder0");    
        videoConv = ElementFactory.Make("videoconvert", "videoconvert0");   
        videoSink = ElementFactory.Make("autovideosink", "sink0");  // and finally the sink to render the video (redirected to the required window handle below in Bus_SyncMessage() ) 

        // create our pipeline which links all the elements together into a valid data flow
        currentPipeline = new Pipeline("pipeline");
        currentPipeline.Add(rtspsrc, rtph264depay, decoder, videoConv, videoSink); // add the required elements into it

        // link the various bits together in the correct order
        if(!rtph264depay.Link(decoder))
            System.Diagnostics.Debug.WriteLine("rtph264depay could not be linked to decoder (bad)");
        else
            System.Diagnostics.Debug.WriteLine("rtph264depay linked to decoder (good)");

        if (!decoder.Link(videoConv))
            System.Diagnostics.Debug.WriteLine("decoder could not be linked to videoconvert (bad)");
        else
            System.Diagnostics.Debug.WriteLine("decoder linked to videoconvert (good)");

        if (!videoConv.Link(videoSink))
            System.Diagnostics.Debug.WriteLine("videoconvert could not be linked to autovideosink (bad)");
        else
            System.Diagnostics.Debug.WriteLine("videoconvert linked to autovideosink (good)");

        rtspsrc.PadAdded += Rtspsrc_PadAdded; // subscribe to the PadAdded event so we can link new pads (sources of data?) to the depayloader when they arrive

        // subscribe to the messaging system of the bus and pipeline so we can monitor status as we go
        Bus bus = currentPipeline.Bus;
        bus.AddSignalWatch();
        bus.Message += Bus_Message;

        bus.EnableSyncMessageEmission();
        bus.SyncMessage += Bus_SyncMessage;

        // finally set the state of the pipeline running so we can get data
        var setStateReturn = currentPipeline.SetState(State.Null);
        System.Diagnostics.Debug.WriteLine("SetStateNULL returned: " + setStateReturn.ToString());
        setStateReturn = currentPipeline.SetState(State.Ready);
        System.Diagnostics.Debug.WriteLine("SetStateReady returned: " + setStateReturn.ToString());
        setStateReturn = currentPipeline.SetState(State.Playing);
        System.Diagnostics.Debug.WriteLine("SetStatePlaying returned: " + setStateReturn.ToString());
    }

    private void Rtspsrc_PadAdded(object o, PadAddedArgs args)
    {
        System.Diagnostics.Debug.WriteLine("Rtspsrc_PadAdded: called with new pad named: " + args.NewPad.Name);

        // a pad has been added to the source so we need to link it to the rest of the pipeline to ultimately display it onscreen
        Pad sinkPad = rtph264depay.GetStaticPad("sink");   // get the depayloader's sink pad so we can link the new pad to it
        System.Diagnostics.Debug.WriteLine("Rtspsrc_PadAdded: rtph264depay sink pad returned: " + sinkPad.Name);

        PadLinkReturn ret = args.NewPad.Link(sinkPad);
        System.Diagnostics.Debug.WriteLine("Rtspsrc_PadAdded: link attempt returned: " + ret.ToString());
    }

    public void killProcess()
    {
        mainLoop.Quit();
    }

    private void Bus_SyncMessage(object o, SyncMessageArgs args)
    {
        if (Gst.Video.Global.IsVideoOverlayPrepareWindowHandleMessage(args.Message))
        {
            System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Message prepare window handle received by: " + args.Message.Src.Name + " " + args.Message.Src.GetType().ToString());

            if (args.Message.Src != null)
            {
                // these checks were in the testvideosrc example and failed, args.Message.Src is always Gst.Element???
                if (args.Message.Src is Gst.Video.VideoSink)
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is VideoSink");
                else
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is NOT VideoSink");

                if (args.Message.Src is Gst.Bin)
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is Bin");
                else
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is NOT Bin");

                try
                {
                    args.Message.Src["force-aspect-ratio"] = true;
                }
                catch (PropertyNotFoundException) { }

                try
                {
                    Gst.Video.VideoOverlayAdapter adapter = new VideoOverlayAdapter(args.Message.Src.Handle);
                    adapter.WindowHandle = windowHandle;
                    adapter.HandleEvents(true);
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Handle passed to adapter: " + windowHandle.ToString());
                }
                catch (Exception ex) { System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Exception Thrown (overlay stage): " + ex.Message); }
            }
        }
        else
        {
            string info;
            IntPtr ptr;
            args.Message.ParseInfo(out ptr, out info);
            System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: " + args.Message.Type.ToString() + " - " + info);
        }
    }

    private void Bus_Message(object o, MessageArgs args)
    {
        var msg = args.Message;
        //System.Diagnostics.Debug.WriteLine("HandleMessage received msg of type: {0}", msg.Type);
        switch (msg.Type)
        {
            case MessageType.Error:
                GLib.GException err;
                string debug;
                msg.ParseError(out err, out debug);   // actually parse the error so we log something useful
                System.Diagnostics.Debug.WriteLine("Bus_Message: Error received: " + err.Message + " ; debug: " + debug);
                break;
            case MessageType.StreamStatus:
                Gst.StreamStatusType status;
                Element theOwner;
                msg.ParseStreamStatus(out status, out theOwner);
                System.Diagnostics.Debug.WriteLine("Bus_Message: Case StreamStatus: status is: " + status + " ; Owner is: " + theOwner.Name);
                break;
            case MessageType.StateChanged:
                State oldState, newState, pendingState;
                msg.ParseStateChanged(out oldState, out newState, out pendingState);
                if (newState == State.Paused)
                    args.RetVal = false;
                System.Diagnostics.Debug.WriteLine("Bus_Message: Pipeline state changed from {0} to {1}: ; Pending: {2}", Element.StateGetName(oldState), Element.StateGetName(newState), Element.StateGetName(pendingState));
                break;
            case MessageType.Element:
                System.Diagnostics.Debug.WriteLine("Bus_Message: Element message: {0}", args.Message.ToString());
                break;
            default:
                System.Diagnostics.Debug.WriteLine("Bus_Message: HandleMessage received msg of type: {0}", msg.Type);
                break;
        }
        args.RetVal = true;
    }
}
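To narrow down a NOFORMAT result like the one above, it can help to log the caps of the newly added pad before attempting the link: NOFORMAT normally means the new pad's caps are not compatible with the peer pad (for example, an rtspsrc adding an audio pad, which rtph264depay's sink pad will refuse). A minimal diagnostic sketch of the PadAdded handler; the `CurrentCaps` and `IsLinked` members are assumed from the gstreamer-sharp bindings, so verify them against your version:

```csharp
private void Rtspsrc_PadAdded(object o, PadAddedArgs args)
{
    // Inspect what the new pad offers before linking; rtspsrc can add
    // audio pads too, and those will fail to link with NOFORMAT.
    Caps caps = args.NewPad.CurrentCaps;   // may be null before negotiation
    System.Diagnostics.Debug.WriteLine("Rtspsrc_PadAdded: new pad caps: "
        + (caps != null ? caps.ToString() : "(not yet negotiated)"));

    Pad sinkPad = rtph264depay.GetStaticPad("sink");
    if (sinkPad.IsLinked)                  // ignore any second (e.g. audio) pad
        return;

    PadLinkReturn ret = args.NewPad.Link(sinkPad);
    System.Diagnostics.Debug.WriteLine("Rtspsrc_PadAdded: link attempt returned: " + ret);
}
```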

1 Answer

Answer 0 (score: 0)

OK, I managed to solve the problems I was having.

The first problem (PadAdded not being called consistently) seems to have been fixed by building for x64 instead of AnyCPU or x86. I suspect my GStreamer library installation was not done correctly.

The second problem (NOFORMAT when linking the new pad) took a bit more work. In the end I followed Florian's advice and looked at using uridecodebin as the source, linking its new pad directly to the autovideosink... with no elements in between.

I now get a new pad added consistently, and the prepare-window-handle bus sync message is sent every time. I have four independent IP streams going into four WinForms panels with good latency (still to be tested glass-to-glass).

To get the latency (somewhat) tuned, I had to hook into uridecodebin's source-setup signal and, assuming the source is of type rtspsrc, set its "latency" property. The code below does not validate the source type, so YMMV; you may get an exception here.
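One way to guard that unvalidated latency assignment is to check the source element's factory name before touching the property. A minimal sketch; the `Factory.Name` accessor is assumed from the gstreamer-sharp bindings, so treat it as an assumption and check your version:

```csharp
void SourceSetup(object sender, GLib.SignalArgs args)
{
    var source = (Element)args.Args[0];
    // Only rtspsrc (and a few similar sources) expose a "latency" property;
    // guard on the factory name so file/http sources don't throw.
    if (source.Factory != null && source.Factory.Name == "rtspsrc")
        source["latency"] = 0;
}
```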

See below for the source of the class that works for me (compiled as x64).

Hopefully this helps someone out there.

Now on to appsink!! :)

/// <summary>
/// class to create a gstreamer pipeline based on an rtsp stream at the provided URL
/// </summary>
class gstPipeline2
{
    // elements for the pipeline
    private Element uriDecodeBin, videoSink;
    private System.Threading.Thread mainGLibThread;
    private GLib.MainLoop mainLoop;

    // the window handle (passed in)
    private IntPtr windowHandle;
    // our pipeline
    private Pipeline currentPipeline = null;

    /// <summary>
    /// Create a new gstreamer pipeline rendering the stream at URL into the provided window handle 
    /// </summary>
    /// <param name="WindowHandle">The handle of the window to render to </param>
    /// <param name="Url">The url of the video stream</param>
    public gstPipeline2(string Url, IntPtr WindowHandle)
    {
        windowHandle = WindowHandle;    // get the handle and save it locally

        // initialise the gstreamer library and associated threads (for diagnostics)
        Gst.Application.Init();

        mainLoop = new GLib.MainLoop();
        mainGLibThread = new System.Threading.Thread(mainLoop.Run);
        mainGLibThread.Start();

        // create each element now for the pipeline
        uriDecodeBin = ElementFactory.Make("uridecodebin", "uriDecodeBin0");  // create an uridecodebin (which handles most of the work for us!!)
        uriDecodeBin["uri"] = Url;   // and set its location (the source of the data)
        videoSink = ElementFactory.Make("autovideosink", "sink0");  // and finally the sink to render the video (redirected to the required window handle below in Bus_SyncMessage() ) 

        // create our pipeline which links all the elements together into a valid data flow
        currentPipeline = new Pipeline("pipeline");
        currentPipeline.Add(uriDecodeBin, videoSink); // add the required elements into it

        uriDecodeBin.PadAdded += uriDecodeBin_PadAdded; // subscribe to the PadAdded event so we can link new pads directly to the video sink when they arrive
        uriDecodeBin.Connect("source-setup", SourceSetup);  // subscribe to the "source-setup" signal, not quite done in the usual C# eventing way but treat it as essentially the same

        // subscribe to the messaging system of the bus and pipeline so we can monitor status as we go
        Bus bus = currentPipeline.Bus;
        bus.AddSignalWatch();
        bus.Message += Bus_Message;

        bus.EnableSyncMessageEmission();
        bus.SyncMessage += Bus_SyncMessage;

        // finally set the state of the pipeline running so we can get data
        var setStateReturn = currentPipeline.SetState(State.Null);
        System.Diagnostics.Debug.WriteLine("SetStateNULL returned: " + setStateReturn.ToString());
        setStateReturn = currentPipeline.SetState(State.Ready);
        System.Diagnostics.Debug.WriteLine("SetStateReady returned: " + setStateReturn.ToString());
        setStateReturn = currentPipeline.SetState(State.Playing);
        System.Diagnostics.Debug.WriteLine("SetStatePlaying returned: " + setStateReturn.ToString());
    }

    private void uriDecodeBin_PadAdded(object o, PadAddedArgs args)
    {
        System.Diagnostics.Debug.WriteLine("uriDecodeBin_PadAdded: called with new pad named: " + args.NewPad.Name);

        // a pad has been added to the source so we need to link it to the rest of the pipeline to ultimately display it onscreen
        Pad sinkPad = videoSink.GetStaticPad("sink");   // get the video sink's sink pad so we can link the new pad directly to it
        System.Diagnostics.Debug.WriteLine("uriDecodeBin_PadAdded: video sink pad returned: " + sinkPad.Name);

        PadLinkReturn ret = args.NewPad.Link(sinkPad);

        System.Diagnostics.Debug.WriteLine("uriDecodeBin_PadAdded: link attempt returned: " + ret.ToString());
    }

    void SourceSetup(object sender, GLib.SignalArgs args)
    {
        // we need to delve into the source portion of the uridecodebin to modify the "latency" property, need to add some validation here to ensure this is an rtspsrc
        var source = (Element)args.Args[0];
        System.Diagnostics.Debug.WriteLine("SourceSetup: source is named: " + source.Name + ", and is of type: " + source.NativeType.ToString());
        source["latency"] = 0;  // this COULD throw an exception if the source is not rtspsrc or similar with a "latency" property
    }

    public void killProcess()
    {
        mainLoop.Quit();
    }

    private void Bus_SyncMessage(object o, SyncMessageArgs args)
    {
        if (Gst.Video.Global.IsVideoOverlayPrepareWindowHandleMessage(args.Message))
        {
            System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Message prepare window handle received by: " + args.Message.Src.Name + " " + args.Message.Src.GetType().ToString());

            if (args.Message.Src != null)
            {
                // these checks were in the testvideosrc example and failed, args.Message.Src is always Gst.Element???
                if (args.Message.Src is Gst.Video.VideoSink)
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is VideoSink");
                else
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is NOT VideoSink");

                if (args.Message.Src is Gst.Bin)
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is Bin");
                else
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is NOT Bin");

                try
                {
                    args.Message.Src["force-aspect-ratio"] = true;
                }
                catch (PropertyNotFoundException) { }

                try
                {
                    Gst.Video.VideoOverlayAdapter adapter = new VideoOverlayAdapter(args.Message.Src.Handle);
                    adapter.WindowHandle = windowHandle;
                    adapter.HandleEvents(true);
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Handle passed to adapter: " + windowHandle.ToString());
                }
                catch (Exception ex) { System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Exception Thrown (overlay stage): " + ex.Message); }
            }
        }
        else
        {
            string info;
            IntPtr ptr;
            args.Message.ParseInfo(out ptr, out info);
            System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: " + args.Message.Type.ToString() + " - " + info);
        }
    }

    private void Bus_Message(object o, MessageArgs args)
    {
        var msg = args.Message;
        //System.Diagnostics.Debug.WriteLine("HandleMessage received msg of type: {0}", msg.Type);
        switch (msg.Type)
        {
            case MessageType.Error:
                GLib.GException err;
                string debug;
                msg.ParseError(out err, out debug);   // actually parse the error so we log something useful
                System.Diagnostics.Debug.WriteLine("Bus_Message: Error received: " + err.Message + " ; debug: " + debug);
                break;
            case MessageType.StreamStatus:
                Gst.StreamStatusType status;
                Element theOwner;
                msg.ParseStreamStatus(out status, out theOwner);
                System.Diagnostics.Debug.WriteLine("Bus_Message: Case StreamStatus: status is: " + status + " ; Owner is: " + theOwner.Name);
                break;
            case MessageType.StateChanged:
                State oldState, newState, pendingState;
                msg.ParseStateChanged(out oldState, out newState, out pendingState);
                if (newState == State.Paused)
                    args.RetVal = false;
                System.Diagnostics.Debug.WriteLine("Bus_Message: Pipeline state changed from {0} to {1}: ; Pending: {2}", Element.StateGetName(oldState), Element.StateGetName(newState), Element.StateGetName(pendingState));
                break;
            case MessageType.Element:
                System.Diagnostics.Debug.WriteLine("Bus_Message: Element message: {0}", args.Message.ToString());
                break;
            default:
                System.Diagnostics.Debug.WriteLine("Bus_Message: HandleMessage received msg of type: {0}", msg.Type);
                break;
        }
        args.RetVal = true;
    }
}
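For reference, the class above is meant to be created once per panel. A hypothetical usage from a WinForms form (the URLs and panel names here are made up, not from the original post):

```csharp
// e.g. in Form_Load: one pipeline per panel, four independent streams
var pipelines = new List<gstPipeline2>
{
    new gstPipeline2("rtsp://192.168.1.10/stream1", panel1.Handle),
    new gstPipeline2("rtsp://192.168.1.11/stream1", panel2.Handle),
    new gstPipeline2("rtsp://192.168.1.12/stream1", panel3.Handle),
    new gstPipeline2("rtsp://192.168.1.13/stream1", panel4.Handle),
};

// and on form close, stop the GLib main loops:
// pipelines.ForEach(p => p.killProcess());
```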