How to save a GLSurfaceView to a texture / buffer / bitmap

Date: 2012-12-26 17:11:59

Tags: java android opengl-es bitmap

I'm developing an Android application that uses a GLSurfaceView. At one point I need to replace my GLSurfaceView with an image of its contents. The question is: how do I capture that image correctly? I used this code:

    v.setDrawingCacheEnabled(true);
    v.measure(View.MeasureSpec.makeMeasureSpec(500, View.MeasureSpec.AT_MOST),
            View.MeasureSpec.makeMeasureSpec(500, View.MeasureSpec.AT_MOST));
    v.layout(0, 0, v.getMeasuredWidth(), v.getMeasuredHeight());

    v.buildDrawingCache(true);
    Bitmap b = Bitmap.createBitmap(v.getDrawingCache());
    v.setDrawingCacheEnabled(false); // clear drawing cache
    return b;

But it always returns a black bitmap.

Could we perhaps produce something other than a Bitmap (something that could also be placed where the GLSurfaceView is)?

2 Answers:

Answer 0 (score: 1)

I don't think that approach works with a GLSurfaceView. Its framebuffer probably lives on the GPU and cannot be accessed directly from the CPU, so the View drawing cache stays black.

You can render the image to a texture using a framebuffer object, then use glReadPixels to download the data into a buffer and convert that buffer to a Bitmap.
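One detail of that conversion trips people up: glReadPixels with GL_RGBA/GL_UNSIGNED_BYTE fills an int[] with 0xAABBGGRR values on little-endian devices, while Bitmap.Config.ARGB_8888 expects 0xAARRGGBB, so red and blue must be swapped. The swizzle itself is plain Java and can be sketched and tested off-device (PixelSwizzle and abgrToArgb are hypothetical names, not Android APIs):

```java
public class PixelSwizzle {
    // Swap the red and blue channels of one pixel: ABGR -> ARGB.
    // Alpha (bits 24-31) and green (bits 8-15) stay where they are.
    static int abgrToArgb(int pix) {
        int a = pix & 0xff000000;       // alpha unchanged
        int b = (pix >> 16) & 0xff;     // blue moves down to bits 0-7
        int g = pix & 0x0000ff00;       // green unchanged
        int r = (pix & 0xff) << 16;     // red moves up to bits 16-23
        return a | r | g | b;
    }

    public static void main(String[] args) {
        // opaque pure red as ABGR is 0xff0000ff; as ARGB it is 0xffff0000
        System.out.println(Integer.toHexString(abgrToArgb(0xff0000ff))); // prints "ffff0000"
    }
}
```

Running this per pixel over the glReadPixels output (or the equivalent bitwise loop, as in the answer below) yields an array that Bitmap.createBitmap accepts directly.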

Answer 1 (score: 0)

This saves the GLSurfaceView to a Bitmap. It works correctly for me.

MyRenderer class:

    @Override
    public void onDrawFrame(GL10 gl) {
        try {
            int w = width_surface;
            int h = height_surface;

            Log.i("hari", "w:" + w + "-----h:" + h);

            // Read the current framebuffer into an int[] via an IntBuffer wrapper.
            int[] b = new int[w * h];
            int[] bt = new int[w * h];
            IntBuffer buffer = IntBuffer.wrap(b);
            buffer.position(0);
            GLES20.glReadPixels(0, 0, w, h, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buffer);

            // OpenGL returns RGBA pixels bottom-up, but an Android Bitmap
            // expects ARGB top-down: swap red/blue and flip the rows.
            for (int i = 0; i < h; i++) {
                for (int j = 0; j < w; j++) {
                    int pix = b[i * w + j];
                    int pb = (pix >> 16) & 0xff;              // blue down to bits 0-7
                    int pr = (pix << 16) & 0x00ff0000;        // red up to bits 16-23
                    int pix1 = (pix & 0xff00ff00) | pr | pb;  // alpha/green unchanged
                    bt[(h - i - 1) * w + j] = pix1;           // flip vertically
                }
            }
            Bitmap inBitmap = Bitmap.createBitmap(bt, w, h, Bitmap.Config.ARGB_8888);

            // Write the bitmap to external storage as a timestamped JPEG.
            String timeStamp = String.valueOf(Calendar.getInstance().getTimeInMillis());
            String myfile = "hari" + timeStamp + ".jpeg";

            dir_image = new File(Environment.getExternalStorageDirectory() + File.separator
                    + "printerscreenshots" + File.separator + "image");
            dir_image.mkdirs();

            try {
                File tmpFile = new File(dir_image, myfile);
                FileOutputStream fos = new FileOutputStream(tmpFile);
                inBitmap.compress(CompressFormat.JPEG, 90, fos); // compress straight to the file
                fos.close();
            } catch (IOException e) {
                e.printStackTrace();
            }

            Log.v("hari", "screenshots:" + dir_image.toString());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }