RTMP streaming to rtmpsink via gstreamer-1.0 appsrc

Time: 2016-07-21 04:55:20

Tags: c gstreamer rtmp

I am trying to stream my webcam over RTMP. I first tried streaming with the following pipeline:

  

    gst-launch-1.0 -v v4l2src ! 'video/x-raw,width=640,height=480,framerate=30/1' ! queue ! videoconvert ! omxh264enc ! h264parse ! flvmux ! rtmpsink location='rtmp://{MY_IP}/rtmp/live'

It works like a charm: I can see the video on my website.

Then I wanted to capture the frames first and do some processing on them. I did that by pushing the processed data into an appsrc and streaming it through the same pipeline as before, but something went wrong.

I cannot see any stream on the website. Neither the server side nor the client side raises any error or warning. However, I can still fetch the stream with:

  

    gst-launch-1.0 rtmpsrc location='rtmp://{MY_IP}/rtmp/live' ! filesink location='rtmpsrca.flv'

Does anyone have any idea what is going on?

Below are the relevant snippets of my website and my gstreamer pipeline.

The gstreamer pipeline:



void threadgst() {

    App *app = &s_app;
    GstCaps *srccap;
    GstCaps *filtercap;
    GstBus *bus;
    GstElement *pipeline;

    gst_init(NULL, NULL);

    loop = g_main_loop_new(NULL, TRUE);

    // create the pipeline:
    pipeline = gst_pipeline_new("gstreamer-encoder");
    if (!pipeline) {
        g_print("Error creating Pipeline, exiting...");
    }

    // create the appsrc element:
    app->videosrc = gst_element_factory_make("appsrc", "videosrc");
    if (!app->videosrc) {
        g_print("Error creating source element, exiting...");
    }

    // create the queue element:
    app->queue = gst_element_factory_make("queue", "queue");
    if (!app->queue) {
        g_print("Error creating queue element, exiting...");
    }

    app->videocoverter = gst_element_factory_make("videoconvert", "videocoverter");
    if (!app->videocoverter) {
        g_print("Error creating videocoverter, exiting...");
    }

    // create the capsfilter element:
    app->filter = gst_element_factory_make("capsfilter", "filter");
    if (!app->filter) {
        g_print("Error creating filter, exiting...");
    }

    app->h264enc = gst_element_factory_make("omxh264enc", "h264enc");
    if (!app->h264enc) {
        g_print("Error creating omxh264enc, exiting...");
    }

    app->h264parse = gst_element_factory_make("h264parse", "h264parse");
    if (!app->h264parse) {
        g_print("Error creating h264parse, exiting...");
    }

    app->flvmux = gst_element_factory_make("flvmux", "flvmux");
    if (!app->flvmux) {
        g_print("Error creating flvmux, exiting...");
    }

    app->rtmpsink = gst_element_factory_make("rtmpsink", "rtmpsink");
    if (!app->rtmpsink) {
        g_print("Error creating rtmpsink, exiting...");
    }

    g_print("Elements are created\n");

    g_object_set(G_OBJECT(app->rtmpsink), "location",
                 "rtmp://192.168.3.107/rtmp/live live=1", NULL);

    g_print("end of settings\n");

    srccap = gst_caps_new_simple("video/x-raw",
            "format", G_TYPE_STRING, "RGB",
            "width", G_TYPE_INT, 640,
            "height", G_TYPE_INT, 480,
            "framerate", GST_TYPE_FRACTION, 30, 1,
            NULL);

    filtercap = gst_caps_new_simple("video/x-raw",
            "format", G_TYPE_STRING, "I420",
            "width", G_TYPE_INT, 640,
            "height", G_TYPE_INT, 480,
            "framerate", GST_TYPE_FRACTION, 30, 1,
            NULL);

    gst_app_src_set_caps(GST_APP_SRC(app->videosrc), srccap);
    g_object_set(G_OBJECT(app->filter), "caps", filtercap, NULL);

    bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
    g_assert(bus);
    gst_bus_add_watch(bus, (GstBusFunc) bus_call, app);

    gst_bin_add_many(GST_BIN(pipeline), app->videosrc, app->queue,
            app->videocoverter, app->filter, app->h264enc,
            app->h264parse, app->flvmux, app->rtmpsink, NULL);

    g_print("Added all the Elements into the pipeline\n");

    gboolean ok = gst_element_link_many(app->videosrc, app->queue,
            app->videocoverter, app->filter, app->h264enc,
            app->h264parse, app->flvmux, app->rtmpsink, NULL);

    if (ok) g_print("Linked all the Elements together\n");
    else    g_print("*** Linking error ***\n");

    g_assert(app->videosrc);
    g_assert(GST_IS_APP_SRC(app->videosrc));

    g_signal_connect(app->videosrc, "need-data", G_CALLBACK(start_feed), app);
    g_signal_connect(app->videosrc, "enough-data", G_CALLBACK(stop_feed), app);

    g_print("Playing the video\n");
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    g_print("Running...\n");
    g_main_loop_run(loop);

    g_print("Returned, stopping playback\n");
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(bus);
    g_main_loop_unref(loop);
    g_print("Deleting pipeline\n");
}
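
(The start_feed / stop_feed handlers wired up above are not shown; they follow the usual appsrc feeding pattern, roughly like the simplified sketch below. grab_frame() here is just a placeholder for my actual capture-and-process step and is assumed to fill one 640x480 RGB frame, matching the caps above:)

static guint sourceid = 0;

static gboolean push_data(App *app) {
    const gsize size = 640 * 480 * 3;   /* one RGB frame, per the caps */
    GstBuffer *buffer = gst_buffer_new_allocate(NULL, size, NULL);
    GstMapInfo map;
    GstFlowReturn ret;

    /* fill the buffer with the processed frame */
    gst_buffer_map(buffer, &map, GST_MAP_WRITE);
    grab_frame(map.data, size);         /* placeholder for the real capture */
    gst_buffer_unmap(buffer, &map);

    /* hand the buffer over to appsrc */
    g_signal_emit_by_name(app->videosrc, "push-buffer", buffer, &ret);
    gst_buffer_unref(buffer);
    return ret == GST_FLOW_OK;          /* keep feeding while the pipeline accepts data */
}

static void start_feed(GstElement *source, guint size, App *app) {
    if (sourceid == 0)
        sourceid = g_idle_add((GSourceFunc) push_data, app);
}

static void stop_feed(GstElement *source, App *app) {
    if (sourceid != 0) {
        g_source_remove(sourceid);
        sourceid = 0;
    }
}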




My webpage source:



<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8">
<title>Live Streaming</title>

<!-- strobe -->
<script type="text/javascript" src="strobe/lib/swfobject.js"></script>
<script type="text/javascript">
  var parameters = {  
     src: "rtmp://192.168.3.107/rtmp/live",  
     autoPlay: true,  
     controlBarAutoHide: false,  
     playButtonOverlay: true,  
     showVideoInfoOverlayOnStartUp: true,  
     optimizeBuffering : false,  
     initialBufferTime : 0.1,  
     expandedBufferTime : 0.1,  
     minContinuousPlayback : 0.1,  
     //poster: "images/poster.png"  
  };  
  swfobject.embedSWF(
    "strobe/StrobeMediaPlayback.swf"
    , "StrobeMediaPlayback"
    , 1024
    , 768
    , "10.1.0"
    , "strobe/expressInstall.swf"
    , parameters
    , {
      allowFullScreen: "true"
    }
    , {
      name: "StrobeMediaPlayback"
    }
  );
</script>

</head>
<body>
<div id="StrobeMediaPlayback"></div>
</body>
</html>

1 Answer:

Answer 0 (score: 1)

When using appsrc and appsink, people usually do something with the buffers: they take the data out somehow, process it, and create new buffers, but forget to timestamp the new buffers properly.

What is timestamping? It attaches time information to each audio/video buffer. Why? It is the synchronisation mechanism every application (VLC, a web player, ...) relies on to display (present) video/audio to the user at a specific time and at a specific rate; this is the PTS (presentation timestamp). For example, at 30 fps the n-th video frame would carry a PTS of n × 1/30 s.

This is related to the framerate (for video) or the sample rate (for audio), although timestamping works differently there: it is applied per buffer, not per audio sample (each of which is usually 4 bytes).

So what is probably happening on your web side: it receives the buffers, but without this timestamp information the application does not know how or when to display the video, so it fails silently and displays nothing.

The GStreamer application works because it apparently has some algorithm for guessing the framerate and so on.

As I said, you have two options.

1. Calculate your PTS and duration yourself, along these lines:

guint64 calculated_pts = some_cool_algorithm();
GstBuffer *buffer = gst_buffer_new_wrapped(data, size); // wrap your processed data (size in bytes)
GST_BUFFER_PTS(buffer) = calculated_pts;  // in nanoseconds
GST_BUFFER_DURATION(buffer) = 1234567890; // in nanoseconds
// push the buffer to appsrc
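
For a constant-framerate stream like the 30/1 one in the question, some_cool_algorithm() can be as simple as deriving the PTS from a running frame counter (a sketch; the 30/1 rate is taken from the caps above):

static guint64 frame_count = 0;

guint64 some_cool_algorithm(void) {
    /* the n-th frame is presented at n * (1/30) s, expressed in nanoseconds */
    return gst_util_uint64_scale(frame_count++, GST_SECOND, 30);
}

/* the matching duration of one frame at 30 fps would then be:
   GST_BUFFER_DURATION(buffer) = gst_util_uint64_scale(1, GST_SECOND, 30); */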

2. Or turn on do-timestamp on the appsrc so that it timestamps the buffers automatically. Off the top of my head I am not sure exactly how it does this: it either picks the framerate from the caps, or it generates the PTS from the times at which you push frames into it.
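
A minimal sketch of that second option, applied to the appsrc from the question (is-live, do-timestamp and format are standard GstAppSrc/GstBaseSrc properties):

/* run appsrc as a live source in time format and let it
 * timestamp each buffer as it is pushed */
g_object_set(G_OBJECT(app->videosrc),
             "is-live", TRUE,
             "do-timestamp", TRUE,
             "format", GST_FORMAT_TIME,
             NULL);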