Cropping an image from a YV12 or NV12 byte array

Date: 2014-01-13 15:04:56

Tags: android image-processing crop

I have implemented Camera.PreviewCallback and receive the raw image (in YV12 or NV12 format) as a byte array. I am looking for a way to crop a part of that image without converting it to a Bitmap first. The cropped part of the image will then be streamed elsewhere (again as a byte array).

Any help is appreciated.

public class CameraAccess implements Camera.PreviewCallback,
        LoaderCallbackInterface {

private byte[] lastFrame;

@Override
public void onPreviewFrame(byte[] frame, Camera arg1) {
    synchronized(this) {
       this.lastFrame = frame;

    }
}

public byte[] cropFrame(Integer x, Integer y, Integer width, Integer height) {
    synchronized(this) {
       // how to crop directly from byte array?
       return null;
    }
}

}

2 Answers:

Answer 0 (score: 4)

An image stored as a byte array is just one huge array holding every pixel of the image. It starts with the top-left pixel, runs along the row to the right, and then continues with the next row down (starting again on the left).

So, to crop it, you just copy the pixels you want into a new byte array with a pair of for loops:

Rect cropArea = ... // the area to crop
int currentPos = 0;
byte[] croppedOutput = new byte[cropArea.width() * cropArea.height()];
for (int y = 0; y < height; y++) {    // height/width of the full frame
  for (int x = 0; x < width; x++) {
    // check whether x and y fall inside the crop area you want
    if (cropArea.contains(x, y)) {
       croppedOutput[currentPos++] = frame[positionInArrayForXY(x, y)];
    }
  }
}

You will have to do some extra math for the positionInArrayForXY method; it is essentially y * frameWidth + x, taking care of the zero-based indexing.

ps.: I believe it is 1 byte per pixel, but I'm not sure; if it is 2 bytes per pixel there is some extra math involved. The idea stays the same, though, and you can build on it.
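Assuming the row-major layout described above, the positionInArrayForXY helper from the loop could be sketched like this (the method name comes from the snippet above; it is hypothetical, and the index works out to y * frameWidth + x):

```java
// Hypothetical helper assumed by the loop above: maps pixel (x, y) to the
// index of its byte in a row-major frame buffer of the given width.
public class FrameIndex {

    static int positionInArrayForXY(int x, int y, int frameWidth) {
        // Every complete row above contributes frameWidth bytes,
        // plus x bytes into the current row.
        return y * frameWidth + x;
    }

    public static void main(String[] args) {
        System.out.println(positionInArrayForXY(0, 0, 640));  // 0
        System.out.println(positionInArrayForXY(0, 1, 640));  // 640
        System.out.println(positionInArrayForXY(10, 2, 640)); // 1290
    }
}
```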

Edit:

To answer your comment: no, this stuff has no header, it is just the raw pixels. That is why the camera always gives you its parameters, so you can know the dimensions.

When I answered, I was expecting YUV to follow the same array ordering as RGB, which it certainly does not, so let me complete my answer.

I did some extra research, and here you can see methods that do the YUV-to-RGB conversion. If you check them carefully, you will notice that they use 12 bits per pixel, i.e. 1.5 bytes => 921600 * 1.5 = 1382400.
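The 12-bits-per-pixel arithmetic can be sanity-checked with a few lines of plain Java (a 1280 × 720 frame is assumed here, matching the 921600-pixel figure quoted above):

```java
// Buffer size of a YUV 4:2:0 frame (NV12/YV12): 8 bits of luma per pixel
// plus one U/V sample pair per 2x2 pixel block works out to 12 bits
// (1.5 bytes) per pixel.
public class Nv12Size {

    static int bufferSize(int width, int height) {
        return width * height * 12 / 8;
    }

    public static void main(String[] args) {
        // 1280 x 720 = 921600 pixels -> 1382400 bytes.
        System.out.println(bufferSize(1280, 720)); // 1382400
    }
}
```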

Based on that, I can think of a few approaches:

  • (easiest to implement) Convert your frame to RGB (I know you specified that you don't want to, but it is much easier), do the cropping as in my answer, and then stream it.
  • (biggest overhead, not easy at all) If the receiver of the stream must get YUV, do the above but convert back to YUV before streaming, performing the reverse of the linked method.
  • (quite hard to implement, but it answers your original question) Based on my sample code, the code at the link I posted, and the fact that each pixel takes 12 bits, adapt the two-for-loop code to do the cropping directly.
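The third approach could be sketched as follows for NV12 (a sketch, not tested against a real camera; it assumes even crop offsets and sizes so luma and chroma stay aligned). The full-resolution Y plane is copied row by row, followed by the interleaved UV plane, which has half the rows and one U/V byte pair per 2×2 pixel block:

```java
public class Nv12Crop {

    // Crops an NV12 frame directly in the byte domain.
    // Layout: frameW * frameH bytes of Y, followed by frameW * frameH / 2
    // bytes of interleaved UV (one U/V byte pair per 2x2 pixel block).
    // cropX, cropY, cropW and cropH must all be even.
    static byte[] crop(byte[] frame, int frameW, int frameH,
                       int cropX, int cropY, int cropW, int cropH) {
        byte[] out = new byte[cropW * cropH * 3 / 2];

        // Y plane: one byte per pixel, row-major.
        for (int row = 0; row < cropH; row++) {
            System.arraycopy(frame, (cropY + row) * frameW + cropX,
                             out, row * cropW, cropW);
        }

        // UV plane: half the rows; a run of cropW pixels covers cropW / 2
        // UV pairs, i.e. exactly cropW bytes, so the byte offsets match.
        int uvSrcBase = frameW * frameH;
        int uvDstBase = cropW * cropH;
        for (int row = 0; row < cropH / 2; row++) {
            System.arraycopy(frame,
                             uvSrcBase + (cropY / 2 + row) * frameW + cropX,
                             out, uvDstBase + row * cropW, cropW);
        }
        return out;
    }
}
```

For YV12 the chroma is not interleaved but stored as two separate planes (V, then U, each with its own row stride), so the second loop would become two half-width copies per chroma row.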

Answer 1 (score: 1)

Someone asked me to provide the final solution plus some source code, so here is what I did.

Scenario: My project is based on a system-on-chip running Android. I implemented the camera handling for a local camera connected to the board via USB; that camera can be used just like the camera on an Android smartphone. The second camera is IP-based and streams its images over the network. The software design may therefore look a bit confusing; feel free to ask questions.

Solution: Because OpenCV handling, camera initialization, and the color and bitmap conversions are quite tricky, I ended up encapsulating everything in two classes, thereby avoiding duplicated boilerplate all over my Android code.

The first class handles the color/bitmap and OpenCV matrix conversions. It is defined as:

import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.core.Mat;    
import android.graphics.Bitmap;

public interface CameraFrame extends CvCameraViewFrame {
    Bitmap toBitmap();

    @Override
    Mat rgba();

    @Override
    Mat gray();
}

All color and bitmap conversions happen within the implementation of this interface. The actual conversion is done by the utilities that ship with OpenCV for Android. Note that I use only a single Bitmap: this saves resources, since bitmap conversion is CPU-intensive. All UI components display/render this bitmap, and the conversion is performed only when some component actually requests the bitmap.

private class CameraAccessFrame implements CameraFrame {
    private Mat mYuvFrameData;
    private Mat mRgba;
    private int mWidth;
    private int mHeight;
    private Bitmap mCachedBitmap;
    private boolean mRgbaConverted;
    private boolean mBitmapConverted;

    @Override
    public Mat gray() {
        return mYuvFrameData.submat(0, mHeight, 0, mWidth);
    }

    @Override
    public Mat rgba() {
        if (!mRgbaConverted) {
            Imgproc.cvtColor(mYuvFrameData, mRgba,
                    Imgproc.COLOR_YUV2BGR_NV12, 4);
            mRgbaConverted = true;
        }
        return mRgba;
    }

    // @Override
    // public Mat yuv() {
    // return mYuvFrameData;
    // }

    @Override
    public synchronized Bitmap toBitmap() {
        if (mBitmapConverted)
            return mCachedBitmap;

        Mat rgba = this.rgba();
        Utils.matToBitmap(rgba, mCachedBitmap);

        mBitmapConverted = true;
        return mCachedBitmap;
    }

    public CameraAccessFrame(Mat Yuv420sp, int width, int height) {
        super();
        mWidth = width;
        mHeight = height;
        mYuvFrameData = Yuv420sp;
        mRgba = new Mat();

        this.mCachedBitmap = Bitmap.createBitmap(width, height,
                Bitmap.Config.ARGB_8888);
    }

    public synchronized void put(byte[] frame) {
        mYuvFrameData.put(0, 0, frame);
        invalidate();
    }

    public void release() {
        mRgba.release();
        mCachedBitmap.recycle();
    }

    public void invalidate() {
        mRgbaConverted = false;
        mBitmapConverted = false;
    }
};

The camera handling is encapsulated in two special classes, explained below. One (HardwareCamera implements ICamera) handles camera initialization and shutdown, while the second (CameraAccess) handles the OpenCV initialization and the notification of other components (CameraCanvasView extends CanvasView implements CameraFrameCallback) that are interested in receiving camera images and displaying them in an Android view (UI). Those components must be connected (registered) to that class.

The callback (implemented by any UI component) is defined as follows:

public interface CameraFrameCallback {
    void onCameraInitialized(int frameWidth, int frameHeight);

    void onFrameReceived(CameraFrame frame);

    void onCameraReleased();
}

The implementation of this interface is done by the following UI component:

import android.content.Context;
import android.util.AttributeSet;
import android.view.SurfaceHolder;
import CameraFrameCallback;

public class CameraCanvasView extends CanvasView implements CameraFrameCallback {

    private CameraAccess mCamera;
    private int cameraWidth = -1;
    private int cameraHeight = -1;
    private boolean automaticReceive;
    private boolean acceptNextFrame;

    public CameraCanvasView(Context context, AttributeSet attributeSet) {
        super(context, attributeSet);
    }

    public CameraAccess getCamera() {
        return mCamera;
    }

    public boolean getAcceptNextFrame() {
        return acceptNextFrame;
    }

    public void setAcceptNextFrame(boolean value) {
        this.acceptNextFrame = value;
    }

    public void setCamera(CameraAccess camera, boolean automaticReceive) {
        if (camera == null)
            throw new NullPointerException("camera");

        this.mCamera = camera;
        this.mCamera.setAutomaticReceive(automaticReceive);
        this.automaticReceive = automaticReceive;
    }

    @Override
    public void onCameraInitialized(int frameWidth, int frameHeight) {
        cameraWidth = frameWidth;
        cameraHeight = frameHeight;

        setCameraBounds();
    }

    public void setCameraBounds() {

        int width = 0;
        int height = 0;
        if (fixedWidth > 0 && fixedHeight > 0) {
            width = fixedWidth;
            height = fixedHeight;
        } else if (cameraWidth > 0 && cameraHeight > 0) {
            width = cameraWidth;
            height = cameraHeight;
        }

        if (width > 0 && height > 0)
            super.setCameraBounds(width, height, true);
    }

    @Override
    public void onFrameReceived(CameraFrame frame) {
        if (acceptNextFrame || automaticReceive)
            super.setBackground(frame);

        // reset
        acceptNextFrame = false;
    }

    @Override
    public void onCameraReleased() {

        setBackgroundImage(null);
    }

    @Override
    public void surfaceCreated(SurfaceHolder arg0) {
        super.surfaceCreated(arg0);

        if (mCamera != null) {
            mCamera.addCallback(this);

            if (!automaticReceive)
                mCamera.receive(); // we want to get the initial frame
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder arg0) {
        super.surfaceDestroyed(arg0);

        if (mCamera != null)
            mCamera.removeCallback(this);
    }
}

That UI component can be used in an XML layout like this:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical" >

    <eu.level12.graphics.laser.CameraCanvasView
        android:id="@+id/my_camera_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        />

</LinearLayout>

The underlying CanvasView is responsible for drawing the camera image/bitmap onto the Android UI surface, which is another tricky thing and therefore encapsulated as well. Sorry that I cannot include the complete solution here, as it would be too much code.

Anyway, back to the camera handling. The link between the UI components and the camera is made by the CameraAccess class, which also loads OpenCV when the application starts.

import java.util.ArrayList;
import java.util.List;

import org.opencv.android.InstallCallbackInterface;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;

import android.content.Context;
import android.content.SharedPreferences;
import android.content.SharedPreferences.OnSharedPreferenceChangeListener;
import android.graphics.Rect;
import android.preference.PreferenceManager;
import android.text.TextUtils;
import android.util.Log;

public final class CameraAccess implements OnSharedPreferenceChangeListener,
        LoaderCallbackInterface {

    public static final int CAMERA_INDEX_IP = Integer.MAX_VALUE;
    private static final int CAM_NONE = -1;
    private static final int CAM_DEFAULT = 0;
    private static final String DEFAULT_IP = "127.0.0.1";

    // see http://developer.android.com/guide/topics/media/camera.html for more
    // details

    private final static String TAG = "CameraAccess";
    private Context context;
    private int cameraIndex;
    private String cameraURI;
    private List<CameraFrameCallback> mCallbacks = new ArrayList<CameraFrameCallback>();
    private List<IOpenCVLoadedCallback> mLoadedCallbacks = new ArrayList<IOpenCVLoadedCallback>();
    private SharedPreferences preferences;
    private ICamera camera;
    private int mFrameWidth;
    private int mFrameHeight;
    private boolean mOpenCVloaded;
    private boolean isFixed;
    private boolean isDirty;
    private final Rect roi = new Rect();
    private final ManualResetEvent automaticReceive = new ManualResetEvent(true);
    private final AutoResetEvent doReceive = new AutoResetEvent(true);

    private static CameraAccess mInstance;

    public static CameraAccess getInstance(Context context) {

        if (mInstance != null) {
            if (mInstance.isDirty) {
                if (!mInstance.isFixed) {
                    mInstance.releaseCamera();
                    mInstance.connectCamera();
                }

                mInstance.isDirty = false;
            }

            return mInstance;
        }

        mInstance = new CameraAccess(context);

        mInstance.isFixed = false;
        mInstance.connectCamera();

        return mInstance;
    }

    public static CameraAccess getIPCamera(Context context, String uri) {
        if (mInstance != null
                && Utils.as(NetworkCamera.class, mInstance) == null)
            throw new IllegalStateException(
                    "Camera already initialized as non-network/IP.");

        if (mInstance != null)
            return mInstance;

        mInstance = new CameraAccess(context);
        mInstance.connectIPCamera(uri);
        mInstance.isFixed = true;

        return mInstance;
    }

    private CameraAccess(Context context) {

        this.context = context;
        this.preferences = PreferenceManager
                .getDefaultSharedPreferences(context);
        this.preferences.registerOnSharedPreferenceChangeListener(this);
        this.cameraIndex = getCameraIndex();

        if (!OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_7, context,
                this)) {
            Log.e(TAG, "Cannot connect to OpenCVManager");
        } else
            Log.d(TAG, "OpenCVManager successfully connected");
    }

    public Context getContext() {
        return context;
    }

    public boolean isOpenCVLoaded() {
        return mOpenCVloaded;
    }

    @Override
    public void onManagerConnected(int status) {
        mOpenCVloaded = true;

        notifyOpenCVLoadedCallbacks();

        if (mCallbacks.size() > 0 && camera != null)
            camera.connect();
    }

    @Override
    public void onPackageInstall(int operation,
            InstallCallbackInterface callback) {
    }

    @Override
    public void onSharedPreferenceChanged(SharedPreferences sharedPreferences,
            String key) {

        String cameraSelectKey = context
                .getString(R.string.settings_select_camera_key);
        String cameraIPKey = context
                .getString(R.string.settings_camera_ip_address_key);

        if (key.equals(cameraIPKey) || key.equals(cameraSelectKey)) {
            this.preferences = sharedPreferences;
            this.cameraIndex = getCameraIndex();

            this.isDirty = true;
        }

    }

    private int getCameraIndex() {
        if (preferences == null || context == null)
            return CAM_NONE;

        String index = preferences.getString(
                context.getString(R.string.settings_select_camera_key), ""
                        + CAM_DEFAULT);

        this.cameraURI = preferences.getString(
                context.getString(R.string.settings_camera_ip_address_key),
                DEFAULT_IP);

        int intIndex;
        try {
            intIndex = Integer.parseInt(index);
            return intIndex;
        } catch (NumberFormatException ex) {
            Log.e(TAG, "Could not parse camera index: " + ex.getMessage());
            return CAM_NONE;
        }
    }

    public synchronized void addCallback(CameraFrameCallback callback) {

        if (callback == null) {
            Log.e(TAG, "Camera frame callback not added because it is null.");
            return;
        }

        // we don't care if the callback is already in the list
        this.mCallbacks.add(callback);

        Log.d(TAG, String.format("Camera frame callback added: %s (count: %d)",
                callback.getClass().getName(), this.mCallbacks.size()));

        if (camera != null) {
            if (camera.isConnected())
                callback.onCameraInitialized(mFrameWidth, mFrameHeight);
            else
                camera.connect();
        }
    }

    public synchronized void removeCallback(CameraFrameCallback callback) {

        synchronized (this) {
            if (callback == null) {
                Log.e(TAG,
                        "Camera frame callback not removed because it is null.");
                return;
            }

            boolean removed = false;
            do {
                // someone might have added the callback multiple times
                removed = this.mCallbacks.remove(callback);

                if (removed) {
                    callback.onCameraReleased();

                    Log.d(TAG, String.format(
                            "Camera frame callback removed: %s (count: %d)",
                            callback.getClass().getName(),
                            this.mCallbacks.size()));
                }

            } while (removed == true);
        }

        if (mCallbacks.size() == 0)
            releaseCamera();
    }

    public synchronized void addOpenCVLoadedCallback(
            IOpenCVLoadedCallback callback) {

        if (callback == null) {
            return;
        }

        if (mOpenCVloaded) {
            callback.onOpenCVLoaded();
            return;
        }

        // we don't care if the callback is already in the list
        this.mLoadedCallbacks.add(callback);
    }

    // private synchronized void removeOpenCvCallback(
    // IOpenCVLoadedCallback callback) {
    //
    // if (callback == null)
    // return;
    //
    // boolean removed = false;
    // do {
    // // someone might have added the callback multiple times
    // removed = this.mLoadedCallbacks.remove(callback);
    //
    // } while (removed == true);
    // }

    private synchronized void notifyOpenCVLoadedCallbacks() {
        if (!mOpenCVloaded)
            return;

        for (IOpenCVLoadedCallback callback : mLoadedCallbacks)
            callback.onOpenCVLoaded();

        mLoadedCallbacks.clear();
    }

    public boolean isAutomaticReceive() {
        return automaticReceive.isSet();
    }

    public void setAutomaticReceive(boolean automatic) {
        if (automatic)
            automaticReceive.set();
        else
            automaticReceive.reset();
    }

    public boolean hasRegionOfInterest() {
        return !this.roi.isEmpty() && camera != null
                && camera.supportsRegionOfInterest();
    }

    public Rect getRegionOfInterest() {
        return this.roi;
    }

    public void setRegionOfInterest(Rect roi) {
        if (roi == null)
            this.roi.set(0, 0, 0, 0);
        else
            this.roi.set(roi);
    }

    public void receive() {
        doReceive.set();
    }

    public boolean waitForReceive(long milliseconds) {
        try {
            return doReceive.waitOne(milliseconds);
        } catch (InterruptedException e) {
            return false;
        }
    }

    private void connectCamera() {
        Log.d(TAG, "connect to camera " + cameraIndex);
        if (cameraIndex == CAMERA_INDEX_IP) {
            connectIPCamera(null);
        } else {
            connectLocalCamera();
        }
    }

    private void connectLocalCamera() {
        camera = new HardwareCamera(context, this, cameraIndex);
    }

    private void connectIPCamera(String uri) {

        if (TextUtils.isEmpty(uri))
            uri = cameraURI;

        if (TextUtils.isEmpty(uri))
            throw new NullPointerException(
                    "No URI (IP) for the remote network camera specified.");

        // camera = new NetworkCameraOpenCV(this, uri);
        camera = new NetworkCameraCached(this, uri);
        // camera = new NetworkCamera(this, uri);
        Log.d(TAG, "Connected to network camera: " + uri);
    }

    private synchronized void releaseCamera() {

        if (camera != null) {
            camera.release();

            for (CameraFrameCallback callback : mCallbacks)
                callback.onCameraReleased();
        }
    }

    public synchronized void onPreviewFrame(CameraFrame frame) {
        for (CameraFrameCallback callback : mCallbacks) {
            callback.onFrameReceived(frame);
        }
    }

    public synchronized void onCameraInitialized(int width, int height) {
        this.mFrameWidth = width;
        this.mFrameHeight = height;

        for (CameraFrameCallback callback : mCallbacks) {
            callback.onCameraInitialized(width, height);
        }
    }

    public interface CameraFrameCallback {
        void onCameraInitialized(int frameWidth, int frameHeight);

        void onFrameReceived(CameraFrame frame);

        void onCameraReleased();
    }

    public interface IOpenCVLoadedCallback {
        void onOpenCVLoaded();
    }

    public interface ICamera {

        boolean supportsRegionOfInterest();

        void connect();

        void release();

        boolean isConnected();
    }
}

The implementation for a locally connected camera (as on an Android smartphone) is done by the HardwareCamera class. The user member can be regarded as the consumer of the images, mediating between the camera and all UI components.

import java.io.IOException;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

import org.opencv.android.Utils;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.ImageFormat;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.hardware.Camera.Parameters;
import android.hardware.Camera.Size;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;

public class HardwareCamera implements CameraAccess.ICamera,
        Camera.PreviewCallback {

    // see http://developer.android.com/guide/topics/media/camera.html for more
    // details

    private static final boolean USE_THREAD = true;

    private final static String TAG = "HardwareCamera";
    // private final Context context;
    private final int cameraIndex; // example: CameraInfo.CAMERA_FACING_FRONT or
                                    // -1 for
    // IP_CAM
    private final CameraAccess user;
    private Camera mCamera;
    private int mFrameWidth;
    private int mFrameHeight;
    private CameraAccessFrame mCameraFrame;
    private CameraHandlerThread mThread = null;
    private SurfaceTexture texture = new SurfaceTexture(0);

    // needed to avoid OpenCV error:
    // "queueBuffer: BufferQueue has been abandoned!"
    private byte[] mBuffer;

    public HardwareCamera(Context context, CameraAccess user, int cameraIndex) {
        // this.context = context;
        this.cameraIndex = cameraIndex;
        this.user = user;
    }

    // private boolean checkCameraHardware() {
    // if (context.getPackageManager().hasSystemFeature(
    // PackageManager.FEATURE_CAMERA)) {
    // // this device has a camera
    // return true;
    // } else {
    // // no camera on this device
    // return false;
    // }
    // }

    public static Camera getCameraInstance(int facing) {

        Camera c = null;
        Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
        int cameraCount = Camera.getNumberOfCameras();
        int index = -1;

        for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
            Camera.getCameraInfo(camIdx, cameraInfo);
            if (cameraInfo.facing == facing) {
                try {
                    c = Camera.open(camIdx);
                    index = camIdx;
                    break;
                } catch (RuntimeException e) {
                    Log.e(TAG,
                            String.format(
                                    "Camera is not available (in use or does not exist). Facing: %s Index: %s Error: %s",
                                    facing, camIdx, e.getMessage()));

                    continue;
                }
            }
        }

        if (c != null)
            Log.d(TAG, String.format("Camera opened. Facing: %s Index: %s",
                    facing, index));
        else
            Log.e(TAG, "Could not find any camera matching facing: " + facing);

        // returns null if camera is unavailable
        return c;
    }

    private synchronized void connectLocalCamera() {
        if (!user.isOpenCVLoaded())
            return;

        if (USE_THREAD) {
            if (mThread == null) {
                mThread = new CameraHandlerThread(this);
            }

            synchronized (mThread) {
                mThread.openCamera();
            }
        } else {
            oldConnectCamera();
        }

        user.onCameraInitialized(mFrameWidth, mFrameHeight);
    }

    private/* synchronized */void oldConnectCamera() {
        // synchronized (this) {
        if (true) {// checkCameraHardware()) {
            mCamera = getCameraInstance(cameraIndex);
            if (mCamera == null)
                return;

            Parameters params = mCamera.getParameters();
            List<Camera.Size> sizes = params.getSupportedPreviewSizes();

            // Camera.Size previewSize = sizes.get(0);
            Collections.sort(sizes, new PreviewSizeComparer());
            Camera.Size previewSize = null;
            for (Camera.Size s : sizes) {
                if (s == null)
                    break;

                previewSize = s;
            }

            // List<Integer> formats = params.getSupportedPictureFormats();
            // params.setPreviewFormat(ImageFormat.NV21);

            params.setPreviewSize(previewSize.width, previewSize.height);
            mCamera.setParameters(params);

            params = mCamera.getParameters();

            mFrameWidth = params.getPreviewSize().width;
            mFrameHeight = params.getPreviewSize().height;

            int size = mFrameWidth * mFrameHeight;
            size = size
                    * ImageFormat.getBitsPerPixel(params.getPreviewFormat())
                    / 8;

            this.mBuffer = new byte[size];
            Log.d(TAG, "Created callback buffer of size (bytes): " + size);

            Mat mFrame = new Mat(mFrameHeight + (mFrameHeight / 2),
                    mFrameWidth, CvType.CV_8UC1);
            mCameraFrame = new CameraAccessFrame(mFrame, mFrameWidth,
                    mFrameHeight);

            if (this.texture != null)
                this.texture.release();

            this.texture = new SurfaceTexture(0);

            try {
                mCamera.setPreviewTexture(texture);
                mCamera.addCallbackBuffer(mBuffer);
                mCamera.setPreviewCallbackWithBuffer(this);
                mCamera.startPreview();

                Log.d(TAG,
                        String.format(
                                "Camera preview started with %sx%s. Rendering to SurfaceTexture dummy while receiving preview frames.",
                                mFrameWidth, mFrameHeight));
            } catch (Exception e) {
                Log.d(TAG, "Error starting camera preview: " + e.getMessage());
            }
        }
        // }
    }

    @Override
    public synchronized void onPreviewFrame(byte[] frame, Camera arg1) {
        mCameraFrame.put(frame);

        if (user.isAutomaticReceive() || user.waitForReceive(500))
            user.onPreviewFrame(mCameraFrame);

        if (mCamera != null)
            mCamera.addCallbackBuffer(mBuffer);
    }

    private class CameraAccessFrame implements CameraFrame {
        private Mat mYuvFrameData;
        private Mat mRgba;
        private int mWidth;
        private int mHeight;
        private Bitmap mCachedBitmap;
        private boolean mRgbaConverted;
        private boolean mBitmapConverted;

        @Override
        public Mat gray() {
            return mYuvFrameData.submat(0, mHeight, 0, mWidth);
        }

        @Override
        public Mat rgba() {
            if (!mRgbaConverted) {
                Imgproc.cvtColor(mYuvFrameData, mRgba,
                        Imgproc.COLOR_YUV2BGR_NV12, 4);
                mRgbaConverted = true;
            }
            return mRgba;
        }

        // @Override
        // public Mat yuv() {
        // return mYuvFrameData;
        // }

        @Override
        public synchronized Bitmap toBitmap() {
            if (mBitmapConverted)
                return mCachedBitmap;

            Mat rgba = this.rgba();
            Utils.matToBitmap(rgba, mCachedBitmap);

            mBitmapConverted = true;
            return mCachedBitmap;
        }

        public CameraAccessFrame(Mat Yuv420sp, int width, int height) {
            super();
            mWidth = width;
            mHeight = height;
            mYuvFrameData = Yuv420sp;
            mRgba = new Mat();

            this.mCachedBitmap = Bitmap.createBitmap(width, height,
                    Bitmap.Config.ARGB_8888);
        }

        public synchronized void put(byte[] frame) {
            mYuvFrameData.put(0, 0, frame);
            invalidate();
        }

        public void release() {
            mRgba.release();
            mCachedBitmap.recycle();
        }

        public void invalidate() {
            mRgbaConverted = false;
            mBitmapConverted = false;
        }
    };

    private class PreviewSizeComparer implements Comparator<Camera.Size> {
        @Override
        public int compare(Size arg0, Size arg1) {
            if (arg0 != null && arg1 == null)
                return -1;
            if (arg0 == null && arg1 != null)
                return 1;

            if (arg0.width < arg1.width)
                return -1;
            else if (arg0.width > arg1.width)
                return 1;
            else
                return 0;
        }
    }

    private static class CameraHandlerThread extends HandlerThread {
        Handler mHandler;
        HardwareCamera owner;

        CameraHandlerThread(HardwareCamera owner) {
            super("CameraHandlerThread");

            this.owner = owner;

            start();
            mHandler = new Handler(getLooper());
        }

        synchronized void notifyCameraOpened() {
            notify();
        }

        void openCamera() {
            mHandler.post(new Runnable() {
                @Override
                public void run() {
                    owner.oldConnectCamera();
                    notifyCameraOpened();
                }
            });

            try {
                wait();
            } catch (InterruptedException e) {
                Log.w(TAG, "wait was interrupted");
            }
        }
    }

    @Override
    public boolean supportsRegionOfInterest() {
        return false;
    }

    @Override
    public void connect() {
        connectLocalCamera();
    }

    @Override
    public void release() {
        synchronized (this) {

            if (USE_THREAD) {
                if (mThread != null) {
                    mThread.interrupt();
                    mThread = null;
                }
            }

            if (mCamera != null) {
                mCamera.stopPreview();
                mCamera.setPreviewCallback(null);
                try {
                    mCamera.setPreviewTexture(null);
                } catch (IOException e) {
                    Log.e(TAG, "Could not release preview-texture from camera.");
                }

                mCamera.release();

                Log.d(TAG, "Preview stopped and camera released");
            }
            mCamera = null;

            if (mCameraFrame != null) {
                mCameraFrame.release();
            }

            if (texture != null)
                texture.release();
        }
    }

    @Override
    public boolean isConnected() {
        return mCamera != null;
    }
}

The final step is to wire everything together. This is done in the onResume method of the implementing activity.

@Override
protected void onResume() {
    super.onResume();

    if (fourPointView != null) {
        cameraAccess = CameraAccess.getInstance(this);
        canvasView.setCamera(cameraAccess, true);
    } else {
        cameraAccess = null;
    }

    if (cameraAccess != null)
        cameraAccess.setAutomaticReceive(true);

    if (cameraAccess != null && fourPointView != null)
        cameraAccess.setRegionOfInterest(RectTools.toRect(canvasView
                .getCamera().getViewport()));
}

@Override
protected void onPause() {
    super.onPause();

    if (cameraAccess != null)
        cameraAccess.setRegionOfInterest(null);
}

Remarks: I know this is not a complete implementation, but I hope you get the idea. The most interesting part is the color conversion, which can be found at the top of this post.