Screenshot of a SurfaceView with a camera preview inside it

Time: 2014-10-10 07:11:59

Tags: android bitmap screenshot surfaceview

I am trying to implement a feature that takes a photo while recording video. That is why I ended up with the "screenshot of a SurfaceView" approach. However, whenever I try to take a screenshot of the SurfaceView, I always get a blank image.

Here is the code I use to take the snapshot:

View tempView = (View)MY_SURFACE_VIEW;
tempView.setDrawingCacheEnabled(true);
Bitmap tempBmp = Bitmap.createBitmap(tempView.getDrawingCache());
tempView.setDrawingCacheEnabled(false);
//Saving this Bitmap to a File....
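(Editorial note: the drawing cache only captures the "view" layer, which is why the result is blank. On API 24 and later, the framework's PixelCopy API can read the surface layer itself; a minimal sketch, where `handleBitmap` is a hypothetical callback of your own, not part of the original code:)

```java
// Sketch: capture a SurfaceView's *surface* layer with PixelCopy (API 24+).
// Unlike the drawing cache, this reads the actual surface buffer.
Bitmap bitmap = Bitmap.createBitmap(MY_SURFACE_VIEW.getWidth(),
        MY_SURFACE_VIEW.getHeight(), Bitmap.Config.ARGB_8888);
PixelCopy.request(MY_SURFACE_VIEW, bitmap, copyResult -> {
    if (copyResult == PixelCopy.SUCCESS) {
        handleBitmap(bitmap);  // hypothetical callback: save or display it
    }
}, new Handler(Looper.getMainLooper()));
```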

If you think this is a duplicate question, let me assure you that before asking it I tried the following solutions to the same problem here on SO.

  1. https://stackoverflow.com/questions/24134964/issue-with-camera-picture-taken-snapshot-using-surfaceview-in-android
  2. Facing issue to take a screenshot while recording a video
  3. Take camera screenshot while recording - Like in Galaxy S3?
  4. Taking screen shot of a SurfaceView in android
  5. Get screenshot of surfaceView in Android (this is the right answer, but only a partial one; I have asked @sajar to elaborate on it)
  6. Other resources on the internet:
    1. http://www.coderanch.com/t/622613/Android/Mobile/capture-screenshot-simple-animation-project
    2. http://www.phonesdevelopers.com/1795894/

None of these has worked for me so far. I also understand that we need to create some Thread that interacts with the surface holder and gets the bitmap from it, but I have no idea how to implement that.

Any help is highly appreciated.

2 Answers:

Answer 0: (score: 6)

Here's another one: Take screenshot of SurfaceView

SurfaceViews have a "surface" part and a "view" part; your code tries to capture the "view" part. The "surface" part is a separate layer, and there is no trivial "grab all the pixels" method. The basic difficulty is that your app sits on the "producer" side of the surface, not the "consumer" side, so reading the pixels back is problematic. Note that the underlying buffers are in whatever format is most convenient for the data producer, so for the camera preview it will be a YUV buffer.
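To make that YUV point concrete, the per-pixel YUV-to-RGB math is the standard BT.601 integer approximation. The sketch below is plain Java with no Android dependency; the class and method names are my own:

```java
public class YuvPixel {
    // BT.601 "video range" YUV -> packed ARGB: the same math a software
    // converter would run per pixel on an NV21 camera preview frame.
    public static int toArgb(int y, int u, int v) {
        int c = y - 16, d = u - 128, e = v - 128;
        int r = clamp((298 * c + 409 * e + 128) >> 8);
        int g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8);
        int b = clamp((298 * c + 516 * d + 128) >> 8);
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }

    private static int clamp(int x) {
        return x < 0 ? 0 : Math.min(x, 255);
    }
}
```

Running this on the CPU for every pixel of every frame is exactly the cost the hardware path described below avoids.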

The simplest and most efficient way to "capture" the surface pixels is to draw them twice, once for the screen and once for the capture. If you do this with OpenGL ES, the YUV-to-RGB conversion will likely be done by a hardware module, which is much faster than receiving camera frames in a YUV buffer and doing your own conversion.
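The readback step of that draw-twice approach looks roughly like this (a sketch assuming an EGL context is current and `width`/`height` match the surface dimensions):

```java
// Sketch: read the just-rendered GLES frame back into a Bitmap.
// Assumes an EGL context is current on this thread.
ByteBuffer buf = ByteBuffer.allocateDirect(width * height * 4);
buf.order(ByteOrder.LITTLE_ENDIAN);
GLES20.glReadPixels(0, 0, width, height,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buf);
buf.rewind();
Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
bmp.copyPixelsFromBuffer(buf);
// Note: GL's origin is bottom-left, so the result is vertically flipped.
```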

Grafika's "texture from camera" activity demonstrates manipulating incoming video data with GLES. After rendering, you can get the pixels with glReadPixels(). The performance of glReadPixels() can vary significantly between devices and between use cases. EglSurfaceBase#saveFrame() shows how to capture the frame to a Bitmap and save it as a PNG.

More information about the Android graphics architecture, notably the producer-consumer nature of the SurfaceView surface, can be found in this document.

Answer 1: (score: -1)

import android.annotation.SuppressLint;
import android.app.Activity;
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.hardware.Camera;
import android.hardware.Camera.PictureCallback;
import android.opengl.GLSurfaceView;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.Toast;

import java.io.ByteArrayOutputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;

public class AndroidSurfaceviewExample extends Activity implements SurfaceHolder.Callback {

static Camera camera;
SurfaceView surfaceView;
SurfaceHolder surfaceHolder;
static boolean boo;
static Thread x;
GLSurfaceView glSurfaceView;
public static Bitmap mBitmap;
public static Camera.Parameters param;
public static Camera.Size mPreviewSize;
public static byte[] byteArray;

PictureCallback jpegCallback;
private Bitmap inputBMP = null, bmp, bmp1;
public static ImageView imgScreen;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    setContentView(R.layout.camera);

    surfaceView = (SurfaceView) findViewById(R.id.surfaceView);
    surfaceHolder = surfaceView.getHolder();
    Button btnTakeScreen = (Button)findViewById(R.id.btnTakeScreen);
    imgScreen = (ImageView)findViewById(R.id.imgScreen);



    btnTakeScreen.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View view) {
            Bitmap screen = Bitmap.createBitmap(getBitmap());
            imgScreen.setImageBitmap(screen);
        }
    });


    // Install a SurfaceHolder.Callback so we get notified when the
    // underlying surface is created and destroyed.
    surfaceHolder.addCallback(this);

    // deprecated setting, but required on Android versions prior to 3.0
    surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

    jpegCallback = new PictureCallback() {
        @SuppressLint("WrongConstant")
        public void onPictureTaken(byte[] data, Camera camera) {
            FileOutputStream outStream = null;
            try {
                outStream = new FileOutputStream(String.format("/sdcard/%d.jpg", System.currentTimeMillis()));
                outStream.write(data);
                outStream.close();
                Log.d("Log", "onPictureTaken - wrote bytes: " + data.length);
            } catch (IOException e) {
                e.printStackTrace();
            }
            Toast.makeText(getApplicationContext(), "Picture Saved", 2000).show();
            refreshCamera();
        }
    };
}

public void refreshCamera() {
    if (surfaceHolder.getSurface() == null) {
        // preview surface does not exist
        return;
    }

    // stop preview before making changes
    try {
        camera.stopPreview();
    } catch (Exception e) {
        // ignore: tried to stop a non-existent preview
    }

    // set preview size and make any resize, rotate or
    // reformatting changes here
    // start preview with new settings
    try {
        camera.setPreviewDisplay(surfaceHolder);
        camera.startPreview();
    } catch (Exception e) {
        Log.e("Log", "Failed to restart preview", e);
    }
}

public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
    // Now that the size is known, set up the camera parameters and begin
    // the preview.
    refreshCamera();
}

public void surfaceCreated(SurfaceHolder holder) {


    if (camera == null) {
        try {
            camera = Camera.open();
        } catch (RuntimeException ignored) {
        }
    }

    try {
        if (camera != null) {
            camera.setPreviewDisplay(surfaceHolder);
        }
    } catch (Exception e) {
        if (camera != null)
            camera.release();
        camera = null;
    }

    if (camera == null) {
        return;
    } else {
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] bytes, Camera camera) {
                if (param == null) {
                    return;
                }
                byteArray = bytes;
            }
        });
    }

    param = camera.getParameters();
    // Use the preview size the camera is actually running at, so the YUV
    // dimensions used in getBitmap() match the frames from onPreviewFrame().
    mPreviewSize = param.getPreviewSize();

    param.setColorEffect(Camera.Parameters.EFFECT_NONE);

    //set antibanding to none
    if (param.getAntibanding() != null) {
        param.setAntibanding(Camera.Parameters.ANTIBANDING_OFF);
    }

    // set white ballance
    if (param.getWhiteBalance() != null) {
        param.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_CLOUDY_DAYLIGHT);
    }

    //set flash
    if (param.getFlashMode() != null) {
        param.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
    }

    //set zoom
    if (param.isZoomSupported()) {
        param.setZoom(0);
    }

    //set focus mode
    param.setFocusMode(Camera.Parameters.FOCUS_MODE_INFINITY);


    // modify parameter
    camera.setParameters(param);
    try {
        // The Surface has been created, now tell the camera where to draw
        // the preview.
        camera.setPreviewDisplay(surfaceHolder);
        camera.startPreview();
    } catch (Exception e) {
        // check for exceptions
        System.err.println(e);
        return;
    }
}

public void surfaceDestroyed(SurfaceHolder holder) {
    // stop preview and release camera
    if (camera != null) {
        camera.stopPreview();
        camera.release();
        camera = null;
    }
}



public Bitmap getBitmap() {
    try {
        // Bail out until the camera is configured and a frame has arrived.
        if (param == null || mPreviewSize == null || byteArray == null)
            return null;

        int format = param.getPreviewFormat();
        YuvImage yuvImage = new YuvImage(byteArray, format, mPreviewSize.width, mPreviewSize.height, null);
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();

        Log.i("myLog", "preview bytes: " + byteArray.length);

        Rect rect = new Rect(0, 0, mPreviewSize.width, mPreviewSize.height);

        yuvImage.compressToJpeg(rect, 75, byteArrayOutputStream);
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inPurgeable = true;
        options.inInputShareable = true;
        mBitmap = BitmapFactory.decodeByteArray(byteArrayOutputStream.toByteArray(), 0, byteArrayOutputStream.size(), options);

        byteArrayOutputStream.flush();
        byteArrayOutputStream.close();
    } catch (IOException ioe) {
        ioe.printStackTrace();
    }

    return mBitmap;
}
}