GLSurfaceView save to PNG - glReadPixels error

Date: 2013-12-16 01:23:01

Tags: android dalvik glsurfaceview glreadpixels

I am trying to render the output of a GLSurfaceView to a PNG on the sdCard and am running into some problems. I have spent a few days trying to sort through similar SO questions, and this is just above my skill level. Could someone please help me sort through the log below and see where I might be going wrong.

Many thanks.

Log:

12-16 12:09:18.831: E/AndroidRuntime(29864): FATAL EXCEPTION: GLThread 2712
12-16 12:09:18.831: E/AndroidRuntime(29864): java.nio.BufferUnderflowException
12-16 12:09:18.831: E/AndroidRuntime(29864):    at java.nio.Buffer.checkGetBounds(Buffer.java:177)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at java.nio.DirectByteBuffer.get(DirectByteBuffer.java:66)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at java.nio.IntToByteBufferAdapter.get(IntToByteBufferAdapter.java:105)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at java.nio.IntBuffer.get(IntBuffer.java:234)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at com.research.glgrade.GLLayer.grabPixels(GLLayer.java:865)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at com.research.glgrade.GLLayer.saveScreenShot(GLLayer.java:810)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at com.research.glgrade.GLLayer.onDrawFrame(GLLayer.java:794)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1527)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1240)
12-16 12:09:26.849: I/Choreographer(29864): Skipped 478 frames!  The application may be doing too much work on its main thread.

Here is my current code. From the main activity:

GlobalVariables.setPrint("true");
mView.requestRender();

From the renderer class:

public void onDrawFrame(GL10 glUnused ) {
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    final String vertexShader = getVertexShader();
    final String fragmentShader = getFragmentShader();

    final int vertexShaderHandle = ShaderHelper.compileShader(
            GLES20.GL_VERTEX_SHADER, vertexShader);
    final int fragmentShaderHandle = ShaderHelper.compileShader(
            GLES20.GL_FRAGMENT_SHADER, fragmentShader);

    mProgramHandle = ShaderHelper.createAndLinkProgram(vertexShaderHandle,
            fragmentShaderHandle, new String[] { "a_Position",
                    "a_TexCoordinate" });

    // Set our per-vertex lighting program.
    GLES20.glUseProgram(mProgramHandle);
            // Set program handles for cube drawing. 
    mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgramHandle,
            "u_MVPMatrix");
    mTextureUniformHandle0 = GLES20.glGetUniformLocation(mProgramHandle,
            "u_Texture0");
    mTextureUniformHandle1 = GLES20.glGetUniformLocation(mProgramHandle,
            "u_Texture1");

    GLES20.glActiveTexture(GLES20.GL_TEXTURE4);
    // Bind the texture to this unit.
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDataHandle4);
    // Tell the texture uniform sampler to use this texture in the shader by
    // binding to texture unit 4.
    GLES20.glUniform1i(mTextureUniformHandle4, 4);


    // Draw some cubes.
    Matrix.setIdentityM(mModelMatrix, 0);


    Matrix.translateM(mModelMatrix, 0, mTransX, mTransY, mAngle*0.05f);
    Matrix.rotateM(mModelMatrix, 0, 0.0f, 1.0f, 1.0f, 0.0f);
    drawCube();   


    int width_surfacea = width_surface;
    int height_surfacea = height_surface;

    if ( GlobalVariables.getPrint() != "false" ) {
        String mFrameCount = "1";
        saveScreenShot(0, 0, width_surfacea, height_surfacea, "/save/test.png");
        GlobalVariables.setPrint("false");
    }

}

    public void saveScreenShot(int x, int y, int w, int h, String filename) {

    Bitmap bmp = grabPixels(x, y, w, h);

    try {
        String path = Environment.getExternalStorageDirectory() + "/" + filename;

        File file = new File(path);
        file.createNewFile();

        FileOutputStream fos = new FileOutputStream(file);
        bmp.compress(CompressFormat.PNG, 100, fos);

        fos.flush();

        fos.close();

    } catch (Exception e) {
        e.printStackTrace();
    }
}

public Bitmap grabPixels(int x, int y, int w, int h) {

    int screenshotSize = w * h;

    ByteBuffer bb = ByteBuffer.allocateDirect(screenshotSize * 3);
    bb.order(ByteOrder.nativeOrder());
    bb.position(0);
    GLES20.glReadPixels(0, 0, w, h, GLES20.GL_RGB, GLES20.GL_UNSIGNED_BYTE, bb);
    int pixelsBuffer[] = new int[screenshotSize];

    bb.asIntBuffer().get(pixelsBuffer);
    bb = null;

    Bitmap bitmap = Bitmap.createBitmap(w, h, Bitmap.Config.RGB_565);

    bitmap.setPixels(pixelsBuffer, screenshotSize-w, -w, 0, 0, w, h);

    pixelsBuffer = null;

    short sBuffer[] = new short[screenshotSize];
    ShortBuffer sb = ShortBuffer.wrap(sBuffer);
    bitmap.copyPixelsToBuffer(sb);

    //Making created bitmap (from OpenGL points) compatible with Android bitmap
    for (int i = 0; i < screenshotSize; ++i) {                  
        short v = sBuffer[i];
        sBuffer[i] = (short) (((v&0x1f) << 11) | (v&0x7e0) | ((v&0xf800) >> 11));
    }
    sb.rewind();
    bitmap.copyPixelsFromBuffer(sb);


    return bitmap;

}

3 Answers:

Answer 0 (score: 2)

Your array lengths do not match. This line:

 bb.asIntBuffer().get(pixelsBuffer);

causes the exception, because bb.asIntBuffer() is only (w * h * 3) bytes long, while pixelsBuffer is w * h * 4 bytes (speaking in bytes, of course). So get() throws the exception, as per the documentation: http://developer.android.com/reference/java/nio/IntBuffer.html#get%28int[]%29
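The mismatch can be reproduced off-device with plain JDK buffers; a minimal sketch (hypothetical 4x4 size, no GL involved):

```java
import java.nio.BufferUnderflowException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class UnderflowDemo {
    public static void main(String[] args) {
        int w = 4, h = 4;
        int screenshotSize = w * h;                   // 16 pixels
        // 3 bytes per pixel -> 48 bytes -> only 12 ints when viewed as an IntBuffer
        ByteBuffer bb = ByteBuffer.allocateDirect(screenshotSize * 3);
        bb.order(ByteOrder.nativeOrder());
        int[] pixelsBuffer = new int[screenshotSize]; // but get() asks for 16 ints
        try {
            bb.asIntBuffer().get(pixelsBuffer);
        } catch (BufferUnderflowException e) {
            System.out.println("underflow: buffer holds "
                    + bb.asIntBuffer().remaining()
                    + " ints, requested " + screenshotSize);
        }
    }
}
```

The int view holds only (w * h * 3) / 4 ints, so requesting w * h of them underflows, which is exactly the BufferUnderflowException in the log.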

Update

Try this code (warning: I have not even tried to compile it):

public Bitmap grabPixels(int x, int y, int w, int h) {

    int screenshotSize = w * h;

    // Allocate w*h*3 bytes, assuming we retrieve only R, G and B values
    ByteBuffer byteBuffer = ByteBuffer.allocateDirect(screenshotSize * 3);
    byteBuffer.order(ByteOrder.nativeOrder());
    byteBuffer.position(0);
    // Read RGB values into our byte buffer
    GLES20.glReadPixels(x, y, w, h, GLES20.GL_RGB, GLES20.GL_UNSIGNED_BYTE, byteBuffer);

    // Now we need to convert the 24-bit RGB data to ARGB_8888.
    // A direct buffer has no backing array, so copy it out with get()
    // (byteBuffer.array() would throw UnsupportedOperationException).
    byte[] pixelBuffer = new byte[screenshotSize * 3];
    byteBuffer.rewind();
    byteBuffer.get(pixelBuffer);
    // This is the array we are going to use as the backend for the bitmap
    int[] finalPixels = new int[screenshotSize];
    // Let the magic flow
    int j = 0;
    for (int i = 0; i < pixelBuffer.length; i += 3) {
        // Mask with 0xFF: Java bytes are signed and would otherwise sign-extend
        int red = pixelBuffer[i] & 0xFF;
        int green = pixelBuffer[i + 1] & 0xFF;
        int blue = pixelBuffer[i + 2] & 0xFF;

        finalPixels[j++] = 0xFF000000 | (red << 16) | (green << 8) | blue;
    }

    // Create a bitmap of ARGB_8888. Note that glReadPixels returns rows
    // bottom-up, so the result is vertically flipped.
    return Bitmap.createBitmap(finalPixels, w, h, Bitmap.Config.ARGB_8888);
}
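The byte masking in the conversion loop matters: Java bytes are signed, so any channel value above 0x7F would sign-extend and corrupt the packed int. A standalone sketch of just the packing step (the helper name packArgb is made up for illustration):

```java
public class PackDemo {
    // Pack one RGB byte triple into an ARGB_8888 int, masking each
    // channel to 0..255 to avoid sign extension of Java's signed bytes.
    static int packArgb(byte r, byte g, byte b) {
        return 0xFF000000 | ((r & 0xFF) << 16) | ((g & 0xFF) << 8) | (b & 0xFF);
    }

    public static void main(String[] args) {
        // 0xEE as a byte is negative; without the & 0xFF mask it would
        // smear 1-bits across the whole int.
        int argb = packArgb((byte) 0x11, (byte) 0xEE, (byte) 0x33);
        System.out.printf("0x%08X%n", argb); // prints 0xFF11EE33
    }
}
```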

Answer 1 (score: 2)

Sample code that extracts frames from an MPEG file can be found on bigflake. The video frames are rendered with GL, and the saveFrame() function uses glReadPixels() and a Bitmap to save them to disk as PNG.

There are some fairly significant performance pitfalls to watch out for; see the comments in saveFrame() for notes on how best to use NIO. You also want to make sure your EGLConfig matches the format you read the pixels in (glReadPixels() can be very slow if the source and destination formats are not the same).

Update: there is now a slightly more general implementation in Grafika; see EglSurfaceBase#saveFrame().

Answer 2 (score: 1)

I think ByteBuffer bb = ByteBuffer.allocateDirect(screenshotSize * 3); is wrong. It should be ByteBuffer bb = ByteBuffer.allocateDirect(screenshotSize * 4);
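With 4 bytes per pixel the glReadPixels call would presumably also need GLES20.GL_RGBA instead of GL_RGB; the buffer then holds exactly w * h ints and the original get() no longer underflows. A plain-JDK sketch of the size arithmetic (the GL call itself is omitted since it needs a device context):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class RgbaSizeDemo {
    public static void main(String[] args) {
        int w = 4, h = 4;
        int screenshotSize = w * h;
        // 4 bytes per pixel (RGBA): w*h*4 bytes == exactly w*h ints
        ByteBuffer bb = ByteBuffer.allocateDirect(screenshotSize * 4);
        bb.order(ByteOrder.nativeOrder());
        // On the device, GLES20.glReadPixels(..., GL_RGBA, GL_UNSIGNED_BYTE, bb)
        // would fill bb here.
        int[] pixelsBuffer = new int[screenshotSize];
        bb.asIntBuffer().get(pixelsBuffer); // no underflow now
        System.out.println("copied " + pixelsBuffer.length + " pixels");
    }
}
```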