Thanks in advance.
I want to record video from an RTSP camera and, at the same time, process the video frames obtained from an appsink via the new-sample signal. Then, in another application, I read the recorded video and display the information associated with the processed frames.
The documentation says buffer->offset holds the video frame number, but that doesn't work for me: it always contains the same number.
I have this pipeline:
rtspsrc location=rtsp://10.0.0.1:554/video.sdp latency=100 ! rtph264depay ! tee name=t
! queue ! vaapidecodebin ! vaapipostproc format=rgba ! appsink name=appsink t.
! queue ! h264parse ! mp4mux ! filesink sync=false name=filer location=/home/VideoDB/2017-09-04_16:33:46.mp4
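For context, this is roughly how such a pipeline could be created and wired up in code. A minimal sketch, assuming the pipeline string above is stored in a hypothetical pipelineDescription variable; error handling is omitted and the handler name matches the example below:

GError* error = NULL;
// Build the pipeline from the gst-launch description above
GstElement* pipeline = gst_parse_launch(pipelineDescription, &error);
// Fetch the appsink by the name given in the pipeline string
GstElement* appsink = gst_bin_get_by_name(GST_BIN(pipeline), "appsink");
// Ask the appsink to emit the new-sample signal and attach the handler
g_object_set(appsink, "emit-signals", TRUE, NULL);
g_signal_connect(appsink, "new-sample",
                 G_CALLBACK(GstVideoSourcePrivate::newSample), user_data);
gst_element_set_state(pipeline, GST_STATE_PLAYING);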
Code example:
GstFlowReturn GstVideoSourcePrivate::newSample(GstAppSink* sink, gpointer user_data)
{
    ....
    GstSample* sinkSample = gst_app_sink_pull_sample(GST_APP_SINK(sink));
    if (sinkSample) {
        GstBuffer* buffer = gst_sample_get_buffer(sinkSample);
        // I need this position to match the recorded video, or to get the
        // frame's sequence number, so that the processed data can be
        // matched to the recording later.
        gint64 pos = GST_CLOCK_TIME_NONE; // gst_element_query_position() expects a gint64
        gst_element_query_position(self->pipeline(), GST_FORMAT_TIME, &pos);
        ...
        gst_sample_unref(sinkSample); // release the sample once done with the buffer
    }
    ...
}
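Since vaapipostproc is configured for RGBA above, accessing the actual pixel data inside the handler would look roughly like the following. A sketch, assuming the caps negotiated by the pipeline above (needs gst/video/video.h); processFrame is a hypothetical processing call:

GstCaps* caps = gst_sample_get_caps(sinkSample);
GstVideoInfo info;
gst_video_info_from_caps(&info, caps); // width/height/stride of the RGBA frame
GstMapInfo map;
if (gst_buffer_map(buffer, &map, GST_MAP_READ)) {
    // map.data points to the RGBA pixels; map.size is the buffer size
    processFrame(map.data, info.width, info.height); // hypothetical
    gst_buffer_unmap(buffer, &map);
}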
Answer 0 (score 0):
Thanks for your answer.
I did what you suggested, but I couldn't get the expected result.
Then I found that when a videorate element is inserted into the pipeline, buffer->offset starts to report the correct video frame sequence number. Even so, I couldn't get synchronization to within a few milliseconds.
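For reference, the decode branch with videorate inserted could look like the following sketch (the exact insertion point may vary; it could also sit after vaapipostproc):

! queue ! vaapidecodebin ! videorate ! vaapipostproc format=rgba ! appsink name=appsink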
So I read the documentation again and wrote the code below, which gives better results. It seems there is a small amount of latency that has to be compensated for.
https://gstreamer.freedesktop.org/documentation/application-development/advanced/clocks.html
https://gstreamer.freedesktop.org/documentation/plugin-development/advanced/clock.html
...
// Buffer PTS (GST_BUFFER_TIMESTAMP is a deprecated alias for GST_BUFFER_PTS)
int64_t timestamp = GST_BUFFER_PTS(buffer);
// Map the buffer timestamp into stream time using the sample's segment
GstSegment* segment = gst_sample_get_segment(sinkSample);
gint64 pos = gst_segment_to_stream_time(segment, GST_FORMAT_TIME, timestamp);
// Query the pipeline latency and compensate for it
GstQuery* q = gst_query_new_latency();
if (gst_element_query(self->m_pipeline, q)) {
    gboolean live;
    GstClockTime minlat, maxlat;
    gst_query_parse_latency(q, &live, &minlat, &maxlat);
    pos += minlat;
}
gst_query_unref(q);
...
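For the second application that plays back the recording, the stream-time position computed above could then be used to seek to the matching frame. A minimal sketch, assuming playbin and the file path from the pipeline above, where pos is the compensated position:

// Hypothetical playback-side code: open the recording with playbin
GstElement* player = gst_parse_launch(
    "playbin uri=file:///home/VideoDB/2017-09-04_16:33:46.mp4", NULL);
gst_element_set_state(player, GST_STATE_PAUSED);
// Wait until the file is prerolled before seeking
gst_element_get_state(player, NULL, NULL, GST_CLOCK_TIME_NONE);
// Jump to the stream-time position of the processed frame
gst_element_seek_simple(player, GST_FORMAT_TIME,
    (GstSeekFlags)(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE), pos);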