Streaming a video as if it were paused

Asked: 2020-01-21 15:10:54

Tags: c# ffmpeg libav

I'm developing an application that streams multiple h264 video streams to a video wall. I'm using the libav/ffmpeg libraries to stream several video files at once from within the application. The application controls playback speed, seeking, pausing, resuming, and stopping; the video wall only receives the UDP streams.

I'd like the streaming to work so that while a video is paused, the same frame is sent continuously, so the corresponding window on the video wall looks like a video in a paused state.

How can I insert copies of the same h264 frame into the stream without disturbing the frames sent afterwards?

My code is almost an exact port of transcoding.c from the FFmpeg examples. The plan is to keep a copy of the last frame/packet and resend it while playback is paused. Is this likely to work correctly, or should it be handled some other way?

while (true)
{
    if (paused) {
        // USE LAST PACKET
    }
    else 
    {
        if ((ret = ffmpeg.av_read_frame(ifmt_ctx, &packet)) < 0)
            break;
    }
    stream_index = packet.stream_index;

    type = ifmt_ctx->streams[packet.stream_index]->codec->codec_type;
    Console.WriteLine("Demuxer gave frame of stream_index " + stream_index);
    if (filter_ctx[stream_index].filter_graph != null)
    {
        Console.WriteLine("Going to reencode&filter the frame\n");
        frame = ffmpeg.av_frame_alloc();
        if (frame == null)
        {
            ret = ffmpeg.AVERROR(ffmpeg.ENOMEM);
            break;
        }

        packet.dts = ffmpeg.av_rescale_q_rnd(packet.dts,
                ifmt_ctx->streams[stream_index]->time_base,
                ifmt_ctx->streams[stream_index]->codec->time_base,
                AVRounding.AV_ROUND_NEAR_INF | AVRounding.AV_ROUND_PASS_MINMAX);
        packet.pts = ffmpeg.av_rescale_q_rnd(packet.pts,
                ifmt_ctx->streams[stream_index]->time_base,
                ifmt_ctx->streams[stream_index]->codec->time_base,
                AVRounding.AV_ROUND_NEAR_INF | AVRounding.AV_ROUND_PASS_MINMAX);



        if (type == AVMediaType.AVMEDIA_TYPE_VIDEO)
        {

            ret = ffmpeg.avcodec_decode_video2(stream_ctx[packet.stream_index].dec_ctx, frame,
                &got_frame, &packet); 

        }
        else
        {
            ret = ffmpeg.avcodec_decode_audio4(stream_ctx[packet.stream_index].dec_ctx, frame,
                &got_frame, &packet);
        }
        if (ret < 0)
        {
            ffmpeg.av_frame_free(&frame);
            Console.WriteLine("Decoding failed\n");
            break;
        }
        if (got_frame != 0)
        {
            frame->pts = ffmpeg.av_frame_get_best_effort_timestamp(frame);
            ret = filter_encode_write_frame(frame, (uint)stream_index);
            // SAVE LAST FRAME/PACKET HERE
            ffmpeg.av_frame_free(&frame);
            if (ret < 0)
                goto end;
        }
        else
        {
            ffmpeg.av_frame_free(&frame);
        }
    }
    else
    {
        /* remux this frame without reencoding */
        packet.dts = ffmpeg.av_rescale_q_rnd(packet.dts,
                ifmt_ctx->streams[stream_index]->time_base,
                ofmt_ctx->streams[stream_index]->time_base,
                AVRounding.AV_ROUND_NEAR_INF | AVRounding.AV_ROUND_PASS_MINMAX);
        packet.pts = ffmpeg.av_rescale_q_rnd(packet.pts,
                ifmt_ctx->streams[stream_index]->time_base,
                ofmt_ctx->streams[stream_index]->time_base,
                AVRounding.AV_ROUND_NEAR_INF | AVRounding.AV_ROUND_PASS_MINMAX);
        ret = ffmpeg.av_interleaved_write_frame(ofmt_ctx, &packet);
        if (ret < 0)
            goto end;
    }
    ffmpeg.av_free_packet(&packet);
}
