Bitmap in onPictureTaken gets cropped incorrectly

Posted: 2014-11-17 23:11:19

Tags: android bitmap crop face-detection

I want to implement my own face detection/recognition Android app. When the camera finds faces, a rectangle is drawn over the camera preview in real time. The app can also take a picture, but I don't want to save the whole image, only the region inside the rectangle, i.e. the face. When I pass the rectangle's coordinates to Bitmap.createBitmap to crop the picture, how well the crop comes out depends on where the rectangle is displayed: when the detected face is roughly in the middle of the preview, createBitmap crops it more or less correctly, but not when the face is on the left or right side of the display. It looks as if the coordinates I pass to Bitmap.createBitmap are transformed somehow, but I can't work out the ratio. Is there a solution? (A sketch of this kind of scaling follows the code below.)

  • Here is my onPictureTaken method:

    @Override
    public void onPictureTaken(byte[] data, Camera camera) {

        File pictureFile = getOutputMediaFile();
        if (pictureFile == null) {
            Log.d(TAG, "Error creating media file, check storage permissions: ");
            return;
        }

        // Decode the full JPEG and fetch the face rectangle drawn on the preview
        Bitmap picture = BitmapFactory.decodeByteArray(data, 0, data.length);
        RectF faceRect = mPreview.getFaceRect();

        float x = faceRect.left;
        float y = faceRect.top;
        float w = faceRect.right - faceRect.left;
        float h = faceRect.bottom - faceRect.top;

        int intX = (int) x;
        int intY = (int) y;
        int intW = (int) w;
        int intH = (int) h;

        // Crop to the face rectangle and write the result out as a JPEG
        Bitmap croppedPicture = Bitmap.createBitmap(picture, intX, intY, intW, intH);
        ByteArrayOutputStream stream = new ByteArrayOutputStream();
        croppedPicture.compress(Bitmap.CompressFormat.JPEG, 100, stream);
        byte[] byteArrayFromPicture = stream.toByteArray();

        try {
            FileOutputStream fos = new FileOutputStream(pictureFile);
            fos.write(byteArrayFromPicture);
            //fos.write(data);
            fos.close();
        } catch (FileNotFoundException e) {
            Log.d(TAG, "File not found: " + e.getMessage());
        } catch (IOException e) {
            Log.d(TAG, "Error accessing file: " + e.getMessage());
        }
    }
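
For what it's worth, the suspected conversion ratio usually comes from the preview size and the picture size being different, so a rect measured on the preview has to be scaled before it is applied to the decoded picture. Below is a minimal sketch of that scaling, assuming getFaceRect() returns coordinates in the preview's own coordinate space and that mCamera is the open Camera instance (neither is confirmed by the question):

        // Sketch only, not the asker's code: scale the face rect from preview
        // coordinates to picture coordinates before cropping.
        Camera.Parameters params = mCamera.getParameters();
        float scaleX = (float) params.getPictureSize().width / params.getPreviewSize().width;
        float scaleY = (float) params.getPictureSize().height / params.getPreviewSize().height;

        RectF previewRect = mPreview.getFaceRect();
        int cropX = (int) (previewRect.left * scaleX);
        int cropY = (int) (previewRect.top * scaleY);
        int cropW = (int) ((previewRect.right - previewRect.left) * scaleX);
        int cropH = (int) ((previewRect.bottom - previewRect.top) * scaleY);
        Bitmap scaledCrop = Bitmap.createBitmap(picture, cropX, cropY, cropW, cropH);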
    

Below are some examples of the cropped pictures; I don't have enough reputation to post more links:

(Sorry for the pictures of pictures; I was too lazy to implement saving the rectangle along with the photo.)

1 Answer:

Answer 0 (score: 1)

Solved

The solution turned out to be quite simple: because the front-facing camera is used, the captured image is always mirrored, so two if clauses were added:

        Bitmap picture = BitmapFactory.decodeByteArray(data, 0, data.length);
        RectF faceRect = mPreview.getFaceRect();
        Camera.Parameters parameters = mCamera.getParameters();
        int picWidth = parameters.getPictureSize().width;

        int intX = 0;
        int intY = (int) faceRect.top;
        int intW = (int) (faceRect.right - faceRect.left);
        int intH = (int) (faceRect.bottom - faceRect.top);

        // The front camera mirrors the captured picture horizontally, so the
        // x coordinate of the crop has to be reflected around the picture centre.
        if (faceRect.left > picWidth / 2) {
            intX = (int) (faceRect.right - (faceRect.right - picWidth / 2) * 2);
        } else if (faceRect.left <= picWidth / 2) {
            intX = (int) (picWidth - faceRect.right);
        }

        Bitmap croppedPicture = Bitmap.createBitmap(picture, intX, intY, intW, intH);
        ByteArrayOutputStream stream = new ByteArrayOutputStream();
        croppedPicture.compress(Bitmap.CompressFormat.JPEG, 100, stream);
        byte[] byteArrayFromPicture = stream.toByteArray();
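
For comparison, the same mirroring could also be handled on the bitmap itself instead of on the crop coordinates. The following is a minimal sketch, not the answer's code, assuming the front-camera picture only needs a horizontal flip and no additional rotation:

        // Sketch using android.graphics.Matrix: flip the decoded picture so it
        // matches what the preview showed, then crop with the face rect as-is.
        Matrix mirror = new Matrix();
        mirror.preScale(-1f, 1f);
        Bitmap flipped = Bitmap.createBitmap(
                picture, 0, 0, picture.getWidth(), picture.getHeight(), mirror, false);
        Bitmap croppedFace = Bitmap.createBitmap(
                flipped,
                (int) faceRect.left,
                (int) faceRect.top,
                (int) (faceRect.right - faceRect.left),
                (int) (faceRect.bottom - faceRect.top));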