Android + OpenCV + face detection + custom layout

Date: 2012-07-14 21:17:04

Tags: android opencv

I am using:

  • Android 4.0.3
  • OpenCV 2.4.2
  • Samsung Galaxy S2

The face-detection sample (from OpenCV 2.4.2) runs fine. Now, however, I want to create a custom layout that uses only the data extracted by the face detection and builds a game on top of it, without the FdView surface necessarily taking up the whole screen.

I made the modifications below, but I only get a black screen — nothing is drawn at all.

Added an fd.xml layout:

<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="horizontal">

    <org.opencv.samples.fd.FdView
        android:id="@+id/FdView"
        android:layout_width="640dp"
        android:layout_height="480dp"
        android:visibility="visible" />

    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:textColor="#FF0000"
        android:text="hi" />

</LinearLayout>

Modified the BaseLoaderCallback in FdActivity.java:

    private BaseLoaderCallback mOpenCVCallBack = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS:
                {
                    Log.i(TAG, "OpenCV loaded successfully");

                    // Load native libs after OpenCV initialization
                    System.loadLibrary("detection_based_tracker");

                    //EXPERIMENT
                    setContentView(R.layout.fd);
                    FdView surface = (FdView) (findViewById(R.id.FdView));

                    surface = mView;
                    // Create and set View
                    mView = new FdView(mAppContext);
                    mView.setDetectorType(mDetectorType);
                    mView.setMinFaceSize(0.2f);
                    //setContentView(mView);

                    // Check native OpenCV camera
                    if( !mView.openCamera() ) {
                        AlertDialog ad = new AlertDialog.Builder(mAppContext).create();
                        ad.setCancelable(false); // This blocks the 'BACK' button
                        ad.setMessage("Fatal error: can't open camera!");
                        ad.setButton("OK", new DialogInterface.OnClickListener() {
                            public void onClick(DialogInterface dialog, int which) {
                                dialog.dismiss();
                                finish();
                            }
                        });
                        ad.show();
                    }
                } break;
                default:
                {
                    super.onManagerConnected(status);
                } break;
            }
        }
    };

Added constructors in FdView.java:

    public FdView(Context context, AttributeSet attrs, int defStyle) {
        super(context, attrs, defStyle);
    }

    public FdView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

Added constructors in SampleCvViewBase.java:

    public SampleCvViewBase(Context context, AttributeSet attrs, int defStyle) {
        super(context, attrs, defStyle);
    }

    public SampleCvViewBase(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

4 Answers:

Answer 0 (score: 1)

I have the same problem and am also trying to figure it out. I am trying to display an image on a SurfaceView that does not take up the whole screen. While working on this, I read that you cannot have your camera-handler class and the SurfaceView linked from different classes, so I merged everything into one.

So, at the moment I have the camera displaying on the SurfaceView and the frame data being copied into the mFrame variable. Basically I am just working on getting mFrame processed (in a separate thread, in the run() function) and displaying the result on the SurfaceView.
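The core of this approach — the camera callback handing each frame to a worker thread via wait/notify on a shared lock — can be sketched in plain Java. The class and method names below are illustrative, not from the sample:

```java
// Minimal sketch of the callback/worker frame hand-off pattern:
// the camera callback stores the latest frame and notifies; the
// processing thread waits on the same lock, then consumes the frame.
class FrameHandoff {
    private final Object lock = new Object();
    private byte[] frame;           // latest frame, written by the callback
    private boolean stopped = false;

    // Called from the camera's preview-callback thread.
    public void onFrame(byte[] data) {
        synchronized (lock) {
            frame = data.clone();   // copy, like System.arraycopy into mFrame
            lock.notify();          // wake the processing thread
        }
    }

    // Blocks until a new frame arrives (or stop() is called), then returns
    // it; returns null once stopped with no pending frame.
    public byte[] awaitFrame() throws InterruptedException {
        synchronized (lock) {
            while (frame == null && !stopped) {
                lock.wait();
            }
            byte[] f = frame;
            frame = null;
            return f;
        }
    }

    public void stop() {
        synchronized (lock) {
            stopped = true;
            lock.notify();
        }
    }
}
```

The important detail is that both threads synchronize on the *same* object; a `notify()` on one monitor never wakes a `wait()` on another.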

Here is my code, in case it helps (forgive the formatting — my code is also a work in progress):

package org.opencv.samples.tutorial3;

import java.io.IOException;
import java.util.List;

import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;

import android.app.Activity;
import android.app.AlertDialog;
import android.content.Context;
import android.content.DialogInterface;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.ImageFormat;
import android.graphics.Paint;
import android.graphics.Rect;
import android.graphics.RectF;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.Window;
import android.widget.TextView;

public class Sample3Native extends Activity implements SurfaceHolder.Callback, Runnable {


    //Camera variables
    private Camera              cam;
    private boolean             previewing = false;
    private SurfaceHolder       mHolder;
    private SurfaceView         mViewer;
    private int                 mFrameWidth;
    private int                 mFrameHeight;
    private byte[]              mFrame;
    private boolean             mThreadRun;
    private byte[]              mBuffer;
    Sample3View viewclass;
    TextView text;
    int value = 0;
    //==========

    int framecount = 0;

    private BaseLoaderCallback  mOpenCVCallBack = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS:
                {

                    // Load native library after(!) OpenCV initialization
                    System.loadLibrary("native_sample");

                    //constructor for viewclass that works on frames
                    viewclass = new Sample3View();

                    //setContentView(mView);
                    //OpenCam();
                    //setContentView(R.layout.main);

                    // Create and set View
                    CameraConstruct();
                    Camopen();

                } break;
                default:
                {
                    super.onManagerConnected(status);
                } break;
            }
        }
    };

    public Sample3Native()
    {}

    @Override
    public void onCreate(Bundle savedInstanceState) 
    {
        super.onCreate(savedInstanceState);
        requestWindowFeature(Window.FEATURE_NO_TITLE);

        setContentView(R.layout.main);

        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_2, this, mOpenCVCallBack);
    }

    //Camera construction
    public void CameraConstruct()
    {
        mViewer = (SurfaceView)findViewById(R.id.camera_view);
        text = (TextView)findViewById(R.id.text);
        mHolder = mViewer.getHolder();
        mHolder.addCallback(this);
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }


    //calls camera screen setup when screen surface changes
    public void surfaceChanged(SurfaceHolder holder, int format, int width,int height) 
    {
        CamStartDisplay();  
    }

    public void Camclose()
    {
        if(cam != null && previewing)
        {
            cam.setPreviewCallback(null);
            cam.stopPreview();
            cam.release();
            cam = null;

            previewing = false;
        }

        mThreadRun = false;
        viewclass.PreviewStopped();
    }

    //only open camera, and get frame data
    public void Camopen()
    {
        if(!previewing){
            cam = Camera.open();
            //rotate display
            cam.setDisplayOrientation(90);
            if (cam != null)
            {
                //copy each preview frame into mFrame and wake the processing thread
                cam.setPreviewCallbackWithBuffer(new PreviewCallback() 
                {
                    public void onPreviewFrame(byte[] data, Camera camera) 
                    {
                        // synchronize on the Activity: run() waits on that
                        // monitor, so notifying the anonymous callback ("this")
                        // would never wake it
                        synchronized (Sample3Native.this) 
                        {
                            System.arraycopy(data, 0, mFrame, 0, data.length);
                            Sample3Native.this.notify(); 
                        }
                        camera.addCallbackBuffer(mBuffer);
                    }
                });

            }

        }//if not previewing
    }

    //start preview
    public void CamStartDisplay()
    {
        synchronized (this) 
        {
            if(cam != null)
            {
                //stop previewing till after settings is changed
                if(previewing == true)
                {
                    cam.stopPreview();
                    previewing = false;
                }

                Camera.Parameters p = cam.getParameters();
                // WIP: simply take the first supported preview size
                for(Camera.Size s : p.getSupportedPreviewSizes())
                {
                    mFrameWidth = s.width;
                    mFrameHeight = s.height;
                    break;
                }

                p.setPreviewSize(mFrameWidth, mFrameHeight);

                List<String> FocusModes = p.getSupportedFocusModes();
                if (FocusModes.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO))
                {
                    p.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
                }
                cam.setParameters(p);

                //set the width and height for processing
                viewclass.setFrame(mFrameWidth, mFrameHeight);

                int size = mFrameWidth*mFrameHeight;
                size  = size * ImageFormat.getBitsPerPixel(p.getPreviewFormat()) / 8;
                mBuffer = new byte[size];
                mFrame = new byte [size];
                cam.addCallbackBuffer(mBuffer);

                viewclass.PreviewStarted(mFrameWidth, mFrameHeight);

                //start display streaming
                try 
                {
                    //cam.setPreviewDisplay(null);
                    cam.setPreviewDisplay(mHolder);
                    cam.startPreview();
                    previewing = true;
                } 
                catch (IOException e) 
                {
                    e.printStackTrace();
                }

            }//end of if cam != null
        }//synchronising
    }

    //thread gets started when the screen surface is created
    public void surfaceCreated(SurfaceHolder holder) {
        //Camopen();
        //CamStartDisplay();
        (new Thread(this)).start(); 
    }

    //called when the screen surface is stopped
    public void surfaceDestroyed(SurfaceHolder holder) 
    {
        Camclose();
    }

    //this is function that is run by thread
    public void run() 
    {

        mThreadRun = true;
        while (mThreadRun) 
        {
            //text.setText(Integer.toString(value++));
            Bitmap bmp = null;

            synchronized (this) 
            {
                try 
                {
                    this.wait();

                    bmp = viewclass.processFrame(mFrame);
                } 
                catch (InterruptedException e) {}
            }

            if (bmp != null) 
            {
                Canvas canvas = mHolder.lockCanvas();

                if (canvas != null) 
                {
                    canvas.drawBitmap(bmp, (canvas.getWidth() - mFrameWidth) / 2, (canvas.getHeight() - mFrameHeight) / 2, null);
                    mHolder.unlockCanvasAndPost(canvas);
                }
            }//if bmp != null
        }// while thread in run
    }


}//end class
The Sample3View used in this class contains only the processFrame function:

package org.opencv.samples.tutorial3;

import android.content.Context;
import android.graphics.Bitmap;
import android.widget.TextView;

class Sample3View {

    private int mFrameSize;
    private Bitmap mBitmap;
    private int[] mRGBA;

    private int frameWidth;
    private int frameHeight;
    private int count = 0;

    Sample3Native samp;

    //constructor
    public Sample3View() 
    {
    }

    public void setFrame(int width,int height)
    {
        frameWidth = width;
        frameHeight = height;
    }

    public void PreviewStarted(int previewWidth, int previewHeight) {
        mFrameSize = previewWidth * previewHeight;
        mRGBA = new int[mFrameSize];
        mBitmap = Bitmap.createBitmap(previewWidth, previewHeight, Bitmap.Config.ARGB_8888);
    }

    public void PreviewStopped() {
        if(mBitmap != null) {
            mBitmap.recycle();
            mBitmap = null;
        }
        mRGBA = null;   
    }

    public Bitmap processFrame(byte[] data) {
        int[] rgba = mRGBA;

        FindFeatures(frameWidth, frameHeight, data, rgba);

        Bitmap bmp = mBitmap; 
        bmp.setPixels(rgba, 0, frameWidth, 0, 0, frameWidth, frameHeight);


        //samp.setValue(count++);
        return bmp;
    }

    public native void FindFeatures(int width, int height, byte yuv[], int[] rgba);
}

So yes, hopefully this helps. If I get the complete solution I will post it too. If you get there first, please post your solution! Enjoy.

Answer 1 (score: 0)

No real answer (yet) — I am also trying to build a custom layout with OpenCV 2.4.2.

I had a solution that worked perfectly with 2.4.0; if I remember correctly, adding the constructors was enough. But it does not work with 2.4.2.

I will try to figure it out and let you know.

Answer 2 (score: 0)

I ran into the same problem: I wanted to create a custom view using a layout. OpenCV 2.4.2 does not seem to provide this functionality. OpenCV 2.4.3 does, but its tutorials do not say so (they still use the old examples from OpenCV 2.4.2). Its Android samples provide some insight, and I finally found instructions in the OpenCV 2.4.9 documentation.

Hope it helps.
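Concretely, what this answer points to: from OpenCV 2.4.3 onward the SDK ships a `CameraBridgeViewBase` subclass (`JavaCameraView`) that can be declared directly in an XML layout like any other view. A sketch of such a layout follows — attribute names are taken from the OpenCV Android tutorials and should be checked against your SDK version:

```xml
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:opencv="http://schemas.android.com/apk/res-auto"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="horizontal">

    <!-- camera preview as an ordinary layout child, not full-screen -->
    <org.opencv.android.JavaCameraView
        android:id="@+id/camera_view"
        android:layout_width="640dp"
        android:layout_height="480dp"
        opencv:show_fps="true"
        opencv:camera_id="any" />

</LinearLayout>
```

The activity then implements `CvCameraViewListener` and receives frames in `onCameraFrame`, which removes the need for the hand-rolled SurfaceView plumbing shown in Answer 0.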

Answer 3 (score: 0)

Ha, I figured out a way. You can simply keep the OpenCV loader and the custom layout separate.

Define the BaseLoaderCallback mOpenCVCallBack:

    private BaseLoaderCallback mOpenCVCallBack = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS:
                {
                    Log.i(TAG, "OpenCV loaded successfully");

                    // Load native library after(!) OpenCV initialization
                    System.loadLibrary("native_sample");
                } break;
                default:
                {
                    super.onManagerConnected(status);
                } break;
            }
        }
    };

In onCreate, build the custom layout first, then start the OpenCV loader:

    @Override
    public void onCreate(Bundle savedInstanceState) {
        Log.i(TAG, "onCreate");
        super.onCreate(savedInstanceState);
        requestWindowFeature(Window.FEATURE_NO_TITLE);

        // Create and set the custom layout first...
        setContentView(R.layout.main);
        mView = (Sample3View) findViewById(R.id.sample3view);
        mcameraButton = (ImageView) findViewById(R.id.cameraButton);

        // ...then connect to the OpenCV Manager asynchronously
        if (!OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_2, this, mOpenCVCallBack)) {
            Log.e(TAG, "Cannot connect to OpenCV Manager");
        }
    }

That's it! I did it that way, and it works great.