I am using JavaCV's FFmpegFrameGrabber to retrieve frames from a video file. FFmpegFrameGrabber returns a Frame, which basically contains a Buffer[] holding the image pixels of the video frame. Because performance is my top priority, I would like to use OpenGL ES to display the Buffer[] directly, without converting it to a Bitmap.
The view that shows the frames takes up less than half of the screen, and the OpenGL ES documentation states:

"Developers who want to incorporate OpenGL ES graphics in a small portion of their layouts should take a look at TextureView."

So I guess TextureView is the right choice for this task. However, I haven't found many resources about it (most of them are camera preview examples). I would like to ask how to draw a Buffer[] onto a TextureView. If this is not the most efficient way, I am open to alternatives.
Update: Currently my setup is as follows. In my VideoActivity, I repeatedly grab a Frame from the video, extract its ByteBuffer, and send it to my MyGLRenderer2 to be converted into an OpenGL ES texture:
...
mGLSurfaceView = (GLSurfaceView)findViewById(R.id.gl_surface_view);
mGLSurfaceView.setEGLContextClientVersion(2);
mRenderer = new MyGLRenderer2(this);
mGLSurfaceView.setRenderer(mRenderer);
mGLSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
...
private void grabCurrentFrame(final long currentPosition){
    if (mCanSeek) {
        new AsyncTask() {
            @Override
            protected void onPreExecute() {
                super.onPreExecute();
                mCanSeek = false;
            }

            @Override
            protected Object doInBackground(Object[] params) {
                try {
                    Frame frame = mGrabber.grabImage();
                    setCurrentFrame((ByteBuffer) frame.image[0]);
                }
                catch (Exception e) {
                    e.printStackTrace();
                }
                return null;
            }

            @Override
            protected void onPostExecute(Object o) {
                super.onPostExecute(o);
                mCanSeek = true;
            }
        }.execute();
    }
}

private void setCurrentFrame(ByteBuffer buffer){
    mRenderer.setTexture(buffer);
}
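One hazard in the setup above: setTexture() hands the renderer a duplicate() of the grabber's buffer, and a duplicate shares the underlying memory, so the grabber may overwrite the pixels while the GL thread is still reading them. A minimal sketch of a copy-and-swap handoff between the two threads (the FrameSlot class and its names are hypothetical, not part of JavaCV):

```java
import java.nio.ByteBuffer;
import java.util.concurrent.atomic.AtomicReference;

// Hypothetical helper: a single-slot, lock-free handoff between the grabber
// thread and the GL thread. The grabber publishes a private copy of the
// newest frame; the renderer takes it, or gets null if nothing new arrived.
class FrameSlot {
    private final AtomicReference<ByteBuffer> latest = new AtomicReference<>();

    // Called on the grabber thread. Copies the frame so the grabber may
    // reuse its internal buffer immediately.
    public void publish(ByteBuffer frame) {
        ByteBuffer copy = ByteBuffer.allocateDirect(frame.remaining());
        copy.put(frame.duplicate()).flip();
        latest.set(copy);
    }

    // Called on the GL thread (e.g. from onDrawFrame). Returns the newest
    // frame exactly once; older unconsumed frames are simply dropped.
    public ByteBuffer take() {
        return latest.getAndSet(null);
    }
}
```

Dropping stale frames instead of queueing them keeps the renderer showing the most recent image and bounds memory use to one pending copy.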
MyGLRenderer2 looks like this:
public class MyGLRenderer2 implements GLSurfaceView.Renderer {
    private static final String TAG = "MyGLRenderer2";

    private FullFrameTexture mFullFrameTexture;

    public MyGLRenderer2(Context context){
        super();
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
        GLES20.glClearColor(0, 0, 0, 1);
        mFullFrameTexture = new FullFrameTexture();
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        createFrameTexture(mCurrentBuffer, 1280, 720, GLES20.GL_RGB); // should not need to be a power of 2 since I use GL_CLAMP_TO_EDGE
        mFullFrameTexture.draw(textureHandle);
        if (mCurrentBuffer != null) {
            mCurrentBuffer.clear();
        }
    }

    // test
    private ByteBuffer mCurrentBuffer;

    public void setTexture(ByteBuffer buffer){
        mCurrentBuffer = buffer.duplicate();
        mCurrentBuffer.position(0);
    }

    private int[] textureHandles = new int[1];
    private int textureHandle;

    public void createFrameTexture(ByteBuffer data, int width, int height, int format) {
        GLES20.glGenTextures(1, textureHandles, 0);
        textureHandle = textureHandles[0];
        GlUtil.checkGlError("glGenTextures");

        // Bind the texture handle to the 2D texture target.
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle);

        // Configure min/mag filtering, i.e. what scaling method do we use if what we're rendering
        // is smaller or larger than the source image.
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GlUtil.checkGlError("loadImageTexture");

        // Load the data from the buffer into the texture handle.
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, /*level*/ 0, format,
                width, height, /*border*/ 0, format, GLES20.GL_UNSIGNED_BYTE, data);
        GlUtil.checkGlError("loadImageTexture");
    }
}
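A side note on the renderer above: createFrameTexture() calls glGenTextures() on every frame, so texture objects accumulate until memory runs out, which could explain the eventual crash. The usual pattern is to allocate storage once with glTexImage2D() and overwrite it each frame with glTexSubImage2D(). A sketch of just that bookkeeping, with the two GL calls represented by string constants so the logic can be exercised off-device (the class and method names are hypothetical):

```java
// Hypothetical sketch of the allocate-once pattern: glTexImage2D only when
// the texture storage does not yet exist (or the frame size changed),
// glTexSubImage2D for every subsequent frame. On Android these would be
// GLES20.glTexImage2D(...) and GLES20.glTexSubImage2D(...) on one texture id.
class TextureUpdater {
    static final String ALLOCATE = "glTexImage2D";    // full (re)allocation
    static final String UPDATE   = "glTexSubImage2D"; // overwrite existing storage

    private int width = -1, height = -1;

    // Returns which GL upload call the render loop should make for a frame
    // of the given size. Storage is reallocated only when the size changes.
    String uploadCallFor(int w, int h) {
        if (w != width || h != height) {
            width = w;
            height = h;
            return ALLOCATE;
        }
        return UPDATE;
    }
}
```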
FullFrameTexture looks like this:
public class FullFrameTexture {
    private static final String VERTEX_SHADER =
            "uniform mat4 uOrientationM;\n" +
            "uniform mat4 uTransformM;\n" +
            "attribute vec2 aPosition;\n" +
            "varying vec2 vTextureCoord;\n" +
            "void main() {\n" +
            "  gl_Position = vec4(aPosition, 0.0, 1.0);\n" +
            "  vTextureCoord = (uTransformM * ((uOrientationM * gl_Position + 1.0) * 0.5)).xy;\n" +
            "}";

    private static final String FRAGMENT_SHADER =
            "precision mediump float;\n" +
            "uniform sampler2D sTexture;\n" +
            "varying vec2 vTextureCoord;\n" +
            "void main() {\n" +
            "  gl_FragColor = texture2D(sTexture, vTextureCoord);\n" +
            "}";

    private final byte[] FULL_QUAD_COORDINATES = {-1, 1, -1, -1, 1, 1, 1, -1};

    private ShaderProgram shader;
    private ByteBuffer fullQuadVertices;

    private final float[] orientationMatrix = new float[16];
    private final float[] transformMatrix = new float[16];

    public FullFrameTexture() {
        shader = new ShaderProgram(EglUtil.getInstance());
        shader.create(VERTEX_SHADER, FRAGMENT_SHADER);

        fullQuadVertices = ByteBuffer.allocateDirect(4 * 2);
        fullQuadVertices.put(FULL_QUAD_COORDINATES).position(0);

        Matrix.setRotateM(orientationMatrix, 0, 0, 0f, 0f, 1f);
        Matrix.setIdentityM(transformMatrix, 0);
    }

    public void release() {
        shader = null;
        fullQuadVertices = null;
    }

    public void draw(int textureId) {
        shader.use();

        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);

        int uOrientationM = shader.getAttributeLocation("uOrientationM");
        int uTransformM = shader.getAttributeLocation("uTransformM");
        GLES20.glUniformMatrix4fv(uOrientationM, 1, false, orientationMatrix, 0);
        GLES20.glUniformMatrix4fv(uTransformM, 1, false, transformMatrix, 0);

        // Trigger actual rendering.
        renderQuad(shader.getAttributeLocation("aPosition"));

        shader.unUse();
    }

    private void renderQuad(int aPosition) {
        GLES20.glVertexAttribPointer(aPosition, 2, GLES20.GL_BYTE, false, 0, fullQuadVertices);
        GLES20.glEnableVertexAttribArray(aPosition);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    }
}
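Note that the vertex shader above maps clip-space positions to texture coordinates as (pos + 1.0) * 0.5, i.e. [-1, 1] onto [0, 1] with no vertical flip. GL samples 2D textures with the origin at the bottom-left, while decoded video frames are stored top-down, so with identity matrices the frame will typically appear upside-down unless uOrientationM (or the mapping itself) flips the Y axis. The mapping, reproduced in plain Java for illustration (class and method names are hypothetical):

```java
// The shader's texture-coordinate mapping, reproduced in plain Java,
// assuming uOrientationM and uTransformM are identity matrices.
class TexCoordMapping {
    // Shader: vTextureCoord = (pos + 1.0) * 0.5, maps [-1, 1] to [0, 1].
    static float map(float clip) {
        return (clip + 1.0f) * 0.5f;
    }

    // Same mapping with a vertical flip applied, which is what a top-down
    // video frame needs when sampled by GL's bottom-left-origin textures.
    static float mapYFlipped(float clipY) {
        return 1.0f - map(clipY);
    }
}
```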
Right now I can display some frames (in very wrong colors) before the app crashes.
Answer 0 (score: 2):
The most efficient way to do what you are asking is to convert your pixels to an OpenGL ES texture and render that on the TextureView. The function to use is glTexImage2D().
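One detail worth checking before the upload: JavaCV's grabImage() typically delivers pixels as 3-byte BGR (AV_PIX_FMT_BGR24), and rows may be padded (JavaCV exposes the row length via Frame.imageStride), while GLES 2.0's glTexImage2D() accepts no GL_BGR format. Uploading such a buffer as GL_RGB would produce exactly the swapped colors described in the question. A hedged sketch of repacking into tightly packed RGBA before upload (the helper class is hypothetical, and it assumes a BGR24 source with a byte-denominated stride):

```java
import java.nio.ByteBuffer;

// Hypothetical helper: repack a (possibly row-padded) BGR24 frame into
// tightly packed RGBA, which GLES 2.0 can upload as GL_RGBA.
// strideBytes is the number of bytes from the start of one source row to
// the start of the next (at least width * 3 for BGR24).
class BgrToRgba {
    static ByteBuffer repack(ByteBuffer bgr, int width, int height, int strideBytes) {
        ByteBuffer rgba = ByteBuffer.allocateDirect(width * height * 4);
        for (int y = 0; y < height; y++) {
            int row = y * strideBytes;
            for (int x = 0; x < width; x++) {
                int p = row + x * 3;           // absolute offset of this BGR pixel
                byte b = bgr.get(p);
                byte g = bgr.get(p + 1);
                byte r = bgr.get(p + 2);
                rgba.put(r).put(g).put(b).put((byte) 0xFF); // opaque alpha
            }
        }
        rgba.flip();
        return rgba;
    }
}
```

The per-pixel copy costs CPU time; if the grabber can be configured to emit RGBA directly (or the swizzle done in the fragment shader), that avoids this pass entirely.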
You can find some examples in Grafika, which uses that function to upload some generated textures. Take a look at createImageTexture(). Grafika's gles package may be useful if you don't already have GLES code in your app.
FWIW, it would be more efficient to decode the video frames directly to a Surface created from the TextureView's SurfaceTexture, but I don't know whether JavaCV supports that.
Edit: another approach, if you don't mind working with the NDK, is to use ANativeWindow. Create a Surface from the TextureView's SurfaceTexture, pass it to native code, and call ANativeWindow_fromSurface() to get the ANativeWindow. Use ANativeWindow_setBuffersGeometry() to set the size and color format. Lock the buffer, copy the pixels in, then unlock the buffer to post it. I don't think this involves an extra internal data copy, and it may have some advantages over the glTexImage2D() approach.