I'm rendering H.264 video on Android with MediaCodec (and a SurfaceView). Below is part of my code.
@Override
public void surfaceCreated(SurfaceHolder holder) {
}

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    if (videoPlayer == null) {
        videoPlayer = new PlayerThread(holder.getSurface());
        videoPlayer.start();
    }
}

@Override
public void surfaceDestroyed(SurfaceHolder holder) {
    if (videoPlayer != null) {
        videoPlayer.interrupt();
        videoPlayer.isEOS = true;
        videoPlayer = null;
    }
}

private class PlayerThread extends Thread {
    private MediaCodec decoder;
    private Surface surface;
    private final static String mimeType = "video/avc";
    public boolean isEOS = false;

    public PlayerThread(Surface surface) {
        this.surface = surface;
    }

    @Override
    public void run() {
        MediaFormat format = MediaFormat.createVideoFormat(mimeType, frame.width, frame.height);
        decoder = MediaCodec.createDecoderByType(mimeType);
        decoder.configure(format, surface, null, 0);
        decoder.start();

        ByteBuffer[] inputBuffers = decoder.getInputBuffers();
        ByteBuffer[] outputBuffers = decoder.getOutputBuffers();
        int startPTS = 0;

        while (!Thread.interrupted() && !isEOS) {
            frame = frameReader.nextFrame();
            if (startPTS == 0) {
                startPTS = frame.pts;
            }
            int relativePTS = frame.pts - startPTS;

            int inIndex = decoder.dequeueInputBuffer(-1);
            if (inIndex >= 0) {
                ByteBuffer inputBuffer = inputBuffers[inIndex];
                inputBuffer.clear();
                inputBuffer.put(frame.buf, 0, frame.size);
                decoder.queueInputBuffer(inIndex, 0, frame.size, relativePTS * 1000, 0);
            }

            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int outIndex = decoder.dequeueOutputBuffer(info, -1);
            switch (outIndex) {
                case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                    Log.d("DecodeActivity", "INFO_OUTPUT_BUFFERS_CHANGED");
                    outputBuffers = decoder.getOutputBuffers();
                    break;
                case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                    Log.d("DecodeActivity", "New format " + decoder.getOutputFormat());
                    break;
                case MediaCodec.INFO_TRY_AGAIN_LATER:
                    Log.d("DecodeActivity", "dequeueOutputBuffer timed out!");
                    break;
                default:
                    ByteBuffer outputBuffer = outputBuffers[outIndex];
                    Log.v("DecodeActivity", "We can't use this buffer but render it due to the API limit, " + outputBuffer);
                    decoder.releaseOutputBuffer(outIndex, true);
            }
        }

        /* clean up */
        decoder.stop();
        decoder.release();
        decoder = null;
    }
}
I set the PTS, but it doesn't work: the video plays back much too fast.
Does anyone know how to set the correct presentation timestamp in this case? Any help would be appreciated.
Answer 0 (Score: 2)
When decoding video with MediaCodec, you are not the one setting the PTS; you are the one receiving it. When you call releaseOutputBuffer() with the "render" flag set, you are telling the system to render the frame as soon as possible. Pacing the frames is your responsibility.
For an example of a MediaCodec-based video player that controls the playback rate, see Grafika, particularly the SpeedControlCallback class.
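The pacing described in the answer above can be sketched as a small helper that anchors the first frame's presentation time to the system clock and tells the caller how long to sleep before releasing each subsequent frame. This is a hypothetical class (not part of the Android API or Grafika), assuming monotonically increasing timestamps:

```java
// Hypothetical frame pacer: holds back each frame until its PTS is due,
// relative to the wall-clock moment the first frame was released.
public class FramePacer {
    private long firstPtsUs = -1;   // PTS of the first frame seen
    private long startNanos = -1;   // nanoTime() when playback began

    // Returns how long (in milliseconds) the caller should sleep before
    // rendering a frame with the given presentation timestamp (microseconds).
    public long delayMillisFor(long ptsUs, long nowNanos) {
        if (firstPtsUs < 0) {
            firstPtsUs = ptsUs;
            startNanos = nowNanos;
            return 0; // render the first frame immediately
        }
        long dueNanos = startNanos + (ptsUs - firstPtsUs) * 1000L;
        long delayMs = (dueNanos - nowNanos) / 1_000_000L;
        return Math.max(0, delayMs); // never sleep a negative amount
    }
}
```

In the decoder loop, this would be used as `Thread.sleep(pacer.delayMillisFor(info.presentationTimeUs, System.nanoTime()))` just before `releaseOutputBuffer(outIndex, true)`.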
Answer 1 (Score: 0)
In your example, you should consider employing the info.mPresentationTimeUs received as part of decoder.dequeueOutputBuffer(info, -1) for AV sync. Once an output frame is received, you should perform AV sync against a reference clock, which could be the audio clock or the system clock.
Please refer to this link for a good reference example.
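The AV-sync idea in this answer can be sketched as a decision function that compares a frame's presentation time against a reference clock. The class name, thresholds, and policy below are illustrative assumptions, not part of any Android API:

```java
// Hypothetical AV-sync policy: given a frame's PTS and the current
// reference-clock position (both in microseconds), decide what to do.
public class AvSync {
    public enum Action { RENDER, DROP, WAIT }

    private final long lateThresholdUs;  // drop frames later than this
    private final long earlyThresholdUs; // wait if frame is this far ahead

    public AvSync(long lateThresholdUs, long earlyThresholdUs) {
        this.lateThresholdUs = lateThresholdUs;
        this.earlyThresholdUs = earlyThresholdUs;
    }

    public Action decide(long framePtsUs, long clockUs) {
        long diffUs = framePtsUs - clockUs; // >0: frame is early, <0: late
        if (diffUs < -lateThresholdUs) return Action.DROP; // hopelessly late
        if (diffUs > earlyThresholdUs) return Action.WAIT; // not due yet
        return Action.RENDER;                              // close enough
    }
}
```

With the system clock as reference, `clockUs` would be the elapsed time since playback started; with an audio clock, it would be the audio track's playback position.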