Getting an exception when using the pattern-recognition algorithms SURF and SIFT in OpenCV for Android

Asked: 2012-03-23 09:21:28

Tags: android opencv sift surf

I wanted to implement a simple application (a modification of sample2) that shows the features of SIFT, SURF, BRIEF and ORB, so that a user can easily compare rotation invariance, scale invariance or speed. But I ran into a failure I cannot deal with, so I am turning to you for help. Whenever I try to use SIFT or SURF, I always get an exception at the matching line: matcherBruteForce.match(descriptorFrame, matches);

I have a similar AR application that works with these settings, so I cannot figure out where I am making a mistake. I have tried setting the variable "matcherBruteForce" to BRUTEFORCE, BRUTEFORCE_L1, BRUTEFORCE_SL2 and even BRUTEFORCE_HAMMING, but I always get the same exception:

SIFT:

CvException [org.opencv.core.CvException: /home/andreyk/OpenCV2/trunk/opencv_2.3.1.b2/modules/features2d/include/opencv2/features2d/features2d.hpp:2455: error: (-215) DataType<ValueType>::type == matcher.trainDescCollection[iIdx].type() || matcher.trainDescCollection[iIdx].empty() in function static void cv::BruteForceMatcher<Distance>::commonKnnMatchImpl(cv::BruteForceMatcher<Distance>&, const cv::Mat&, std::vector<std::vector<cv::DMatch> >&, int, const std::vector<cv::Mat>&, bool) [with Distance = cv::SL2<float>]
]

SURF:

CvException [org.opencv.core.CvException: /home/andreyk/OpenCV2/trunk/opencv_2.3.1.b2/modules/features2d/include/opencv2/features2d/features2d.hpp:2455: error: (-215) DataType<ValueType>::type == matcher.trainDescCollection[iIdx].type() || matcher.trainDescCollection[iIdx].empty() in function static void cv::BruteForceMatcher<Distance>::commonKnnMatchImpl(cv::BruteForceMatcher<Distance>&, const cv::Mat&, std::vector<std::vector<cv::DMatch> >&, int, const std::vector<cv::Mat>&, bool) [with Distance = cv::SL2<float>]
]

Any help is appreciated.

The whole class:

    package sk.bolyos.opencv;

import java.util.Vector;

import org.opencv.features2d.DMatch;
import org.opencv.features2d.DescriptorExtractor;
import org.opencv.features2d.DescriptorMatcher;
import org.opencv.features2d.FeatureDetector;
import org.opencv.features2d.Features2d;
import org.opencv.features2d.KeyPoint;
import org.opencv.highgui.VideoCapture;
import org.opencv.android.Utils;
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.imgproc.Imgproc;
import org.opencv.highgui.Highgui;

import sk.bolyos.svk.*;

import android.content.Context;
import android.graphics.Bitmap;
import android.util.Log;
import android.view.SurfaceHolder;




public class MyView extends CvViewBase {

    private static final int BOUNDARY = 35;

    private Mat mRgba;
    private Mat mGray;
    private Mat mIntermediateMat;
    private Mat mLogoMilka1,mLogoMilka2,mLogoMilka3,mLogoMilka4;
    ///////////////////DETECTORS
    FeatureDetector siftDetector = FeatureDetector.create(FeatureDetector.SIFT);
    FeatureDetector surfDetector = FeatureDetector.create(FeatureDetector.SURF);
    FeatureDetector fastDetector = FeatureDetector.create(FeatureDetector.FAST);
    FeatureDetector orbDetector = FeatureDetector.create(FeatureDetector.ORB);
    ///////////////////DESCRIPTORS
    DescriptorExtractor siftDescriptor = DescriptorExtractor.create(DescriptorExtractor.SIFT);
    DescriptorExtractor surfDescriptor = DescriptorExtractor.create(DescriptorExtractor.SURF);
    DescriptorExtractor briefDescriptor = DescriptorExtractor.create(DescriptorExtractor.BRIEF);
    DescriptorExtractor orbDescriptor = DescriptorExtractor.create(DescriptorExtractor.ORB);
    ///////////////////DATABASE
    Vector<KeyPoint> vectorMilka1 = new Vector<KeyPoint>();
    Vector<KeyPoint> vectorMilka2 = new Vector<KeyPoint>();
    Vector<KeyPoint> vectorMilka3 = new Vector<KeyPoint>();
    Vector<KeyPoint> vectorMilka4 = new Vector<KeyPoint>();
    Mat descriptorMilka1 = new Mat();
    Mat descriptorMilka2 = new Mat();
    Mat descriptorMilka3 = new Mat(); 
    Mat descriptorMilka4 = new Mat();
    ///////////////////VIDEO
    Vector<KeyPoint> vectorFrame = new Vector<KeyPoint>();
    Mat descriptorFrame = new Mat();

    DescriptorMatcher matcherHamming = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMINGLUT);
    DescriptorMatcher matcherBruteForce = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_SL2);
    Vector<DMatch> matches = new Vector<DMatch>();
    Vector<Mat> siftDescriptors = new Vector<Mat>();
    Vector<Mat> surfDescriptors = new Vector<Mat>();
    Vector<Mat> briefDescriptors = new Vector<Mat>();
    Vector<Mat> orbDescriptors = new Vector<Mat>();

    public MyView(Context context) {
        super(context);
        // TODO Auto-generated constructor stub
        try{
            /*
            if (mLogoMilka1 == null){
                mLogoMilka1 = new Mat();
                mLogoMilka1 = Utils.loadResource(getContext(), R.drawable.milkalogo);
                fillDB(mLogoMilka1,vectorMilka1,descriptorMilka1);
            }
            if (mLogoMilka2 == null){
                mLogoMilka2 = new Mat();
                mLogoMilka2 = Utils.loadResource(getContext(), R.drawable.milkalogom);
                fillDB(mLogoMilka2,vectorMilka2,descriptorMilka2);
            }
            if (mLogoMilka3 == null){
                mLogoMilka3 = new Mat();
                mLogoMilka3 = Utils.loadResource(getContext(), R.drawable.milkalogol);
                fillDB(mLogoMilka3,vectorMilka3,descriptorMilka3);
            }*/
            if (mLogoMilka4 == null){
                mLogoMilka4 = new Mat();
                mLogoMilka4 = Utils.loadResource(getContext(), R.drawable.milkalogolc);
                fillDB(mLogoMilka4,vectorMilka4,descriptorMilka4);
            }

        }catch(Exception e){
            Log.e( "SVK APPLICATION", "in MyView constructor "+e.toString());
        }
    }

    public void fillDB(Mat mLogo,Vector<KeyPoint> vector,Mat descriptor){

      //SIFT 
        siftDetector.detect( mLogo, vector );
        siftDescriptor.compute(mLogo, vector, descriptor);
        siftDescriptors.add(descriptor);
      //SURF 
        surfDetector.detect( mLogo, vector );
        surfDescriptor.compute(mLogo, vector, descriptor);
        surfDescriptors.add(descriptor);
      //FAST+BRIEF 
        fastDetector.detect( mLogo, vector );
        briefDescriptor.compute(mLogo, vector, descriptor);
        briefDescriptors.add(descriptor);
      //ORB 
        orbDetector.detect( mLogo, vector );
        orbDescriptor.compute(mLogo, vector, descriptor);
        orbDescriptors.add(descriptor);

    }


    @Override
    public void surfaceChanged(SurfaceHolder _holder, int format, int width, int height) {
        super.surfaceChanged(_holder, format, width, height);

        synchronized (this) {
            // initialize Mats before usage
            mGray = new Mat();
            mRgba = new Mat();
            mIntermediateMat = new Mat();
            matches = new Vector<DMatch>();
            vectorFrame = new Vector<KeyPoint>();
            descriptorFrame = new Mat(); 
        }
    }

    @Override
    protected Bitmap processFrame(VideoCapture capture) {
        // TODO Auto-generated method stub
        switch (SVKApplikaciaActivity.viewMode) {
        case SVKApplikaciaActivity.VIEW_MODE_SIFT:
            //TODO SIFT
            try{
                //matcherBruteForce = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE);
                //matcherBruteForce.clear();
                matcherBruteForce.add(siftDescriptors);
                matcherBruteForce.train();// proba

                capture.retrieve(mGray, Highgui.CV_CAP_ANDROID_COLOR_FRAME_RGBA);
                Imgproc.resize(mGray, mGray,new Size(480,320)); 
                siftDetector.detect( mGray, vectorFrame );
                siftDescriptor.compute(mGray, vectorFrame, descriptorFrame);

                matcherBruteForce.match(descriptorFrame, matches);  
                Vector<DMatch> matchesXXX = new Vector<DMatch>();
                for (DMatch t : matches)
                    if(t.distance<BOUNDARY)
                        matchesXXX.add(t);
                Mat nGray = new Mat();
                Mat nLogo = new Mat();
                Mat nRgba = new Mat();
                Imgproc.cvtColor(mGray, nGray, Imgproc.COLOR_RGBA2RGB, 3);
                Imgproc.cvtColor(mLogoMilka4, nLogo, Imgproc.COLOR_RGBA2BGR, 3);
                Features2d.drawMatches(nGray, vectorFrame, nLogo, vectorMilka4, matchesXXX, nRgba);
                Imgproc.cvtColor(nRgba, mRgba, Imgproc.COLOR_RGB2RGBA, 4);
            }catch(Exception e){
                Log.e( "SVK APPLICATION","in SIFT "+ e.toString());
            }
            break;
        case SVKApplikaciaActivity.VIEW_MODE_SURF:
            //TODO SURF
            try{
                //matcherBruteForce = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE);
                //matcherBruteForce.clear();
                matcherBruteForce.add(surfDescriptors);
                matcherBruteForce.train();// proba

                capture.retrieve(mGray, Highgui.CV_CAP_ANDROID_COLOR_FRAME_RGBA);
                Imgproc.resize(mGray, mGray,new Size(480,320)); 
                surfDetector.detect( mGray, vectorFrame );
                surfDescriptor.compute(mGray, vectorFrame, descriptorFrame);

                matcherBruteForce.match(descriptorFrame, matches);  
                Vector<DMatch> matchesXXX = new Vector<DMatch>();
                for (DMatch t : matches)
                    if(t.distance<BOUNDARY)
                        matchesXXX.add(t);
                Mat nGray = new Mat();
                Mat nLogo = new Mat();
                Mat nRgba = new Mat();
                Imgproc.cvtColor(mGray, nGray, Imgproc.COLOR_RGBA2RGB, 3);
                Imgproc.cvtColor(mLogoMilka4, nLogo, Imgproc.COLOR_RGBA2BGR, 3);
                Features2d.drawMatches(nGray, vectorFrame, nLogo, vectorMilka4, matchesXXX, nRgba);
                Imgproc.cvtColor(nRgba, mRgba, Imgproc.COLOR_RGB2RGBA, 4);
            }catch(Exception e){
                Log.e( "SVK APPLICATION","in Surf "+ e.toString());
            }
            break;
        case SVKApplikaciaActivity.VIEW_MODE_BRIEF:
            //TODO BRIEF
            try{
                matcherHamming = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMINGLUT);
                matcherHamming.add(briefDescriptors);
                matcherHamming.train();// proba

                capture.retrieve(mGray, Highgui.CV_CAP_ANDROID_COLOR_FRAME_RGBA);
                Imgproc.resize(mGray, mGray,new Size(480,320)); 
                fastDetector.detect( mGray, vectorFrame );
                briefDescriptor.compute(mGray, vectorFrame, descriptorFrame);

                matcherHamming.match(descriptorFrame, matches); 
                Vector<DMatch> matchesXXX = new Vector<DMatch>();
                for (DMatch t : matches)
                    if(t.distance<BOUNDARY)
                        matchesXXX.add(t);
                Mat nGray = new Mat();
                Mat nLogo = new Mat();
                Mat nRgba = new Mat();
                Imgproc.cvtColor(mGray, nGray, Imgproc.COLOR_RGBA2RGB, 3);
                Imgproc.cvtColor(mLogoMilka4, nLogo, Imgproc.COLOR_RGBA2BGR, 3);
                Features2d.drawMatches(nGray, vectorFrame, nLogo, vectorMilka4, matchesXXX, nRgba);
                Imgproc.cvtColor(nRgba, mRgba, Imgproc.COLOR_RGB2RGBA, 4);
            }catch(Exception e){
                Log.e( "SVK APPLICATION","in Brief "+ e.toString());
            }
            break;
        case SVKApplikaciaActivity.VIEW_MODE_ORB:
            //TODO ORB
            try{
                matcherHamming = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMINGLUT);
                matcherHamming.add(orbDescriptors);
                matcherHamming.train();// proba

                capture.retrieve(mGray, Highgui.CV_CAP_ANDROID_COLOR_FRAME_RGBA);
                Imgproc.resize(mGray, mGray,new Size(480,320)); 
                orbDetector.detect( mGray, vectorFrame );
                orbDescriptor.compute(mGray, vectorFrame, descriptorFrame);

                matcherHamming.match(descriptorFrame, matches); 
                Vector<DMatch> matchesXXX = new Vector<DMatch>();
                for (DMatch t : matches)
                    if(t.distance<BOUNDARY)
                        matchesXXX.add(t);
                Mat nGray = new Mat();
                Mat nLogo = new Mat();
                Mat nRgba = new Mat();
                Imgproc.cvtColor(mGray, nGray, Imgproc.COLOR_RGBA2RGB, 3);
                Imgproc.cvtColor(mLogoMilka4, nLogo, Imgproc.COLOR_RGBA2BGR, 3);
                Features2d.drawMatches(nGray, vectorFrame, nLogo, vectorMilka4, matchesXXX, nRgba);
                Imgproc.cvtColor(nRgba, mRgba, Imgproc.COLOR_RGB2RGBA, 4);
            }catch(Exception e){
                Log.e( "SVK APPLICATION","in ORB "+ e.toString());
            }
            break;
        case SVKApplikaciaActivity.VIEW_MODE_AR:
            //TODO AR
            break;    

        }

        Bitmap bmp = Bitmap.createBitmap(mRgba.cols(), mRgba.rows(), Bitmap.Config.ARGB_8888);

        if (Utils.matToBitmap(mRgba, bmp))
            return bmp;

        bmp.recycle();

        return null;
    }

    @Override
    public void run() {
        super.run();

        synchronized (this) {
            // Explicitly deallocate Mats
            if (mRgba != null)
                mRgba.release();
            if (mGray != null)
                mGray.release();
            if (mIntermediateMat != null)
                mIntermediateMat.release();

            mRgba = null;
            mGray = null;
            mIntermediateMat = null;
        }
    }

}

2 Answers:

Answer 0 (score: 3)

I think I know the problem. The matcher you are using cannot be applied to SIFT and SURF descriptors. If you have to use a DescriptorMatcher with SIFT or SURF, you must create it as

DescriptorMatcher matcherBruteForce=DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_SL2);

Since SURF and SIFT only produce float-based descriptors, the matcher will return an error if you set the DescriptorMatcher to HAMMING.

Note that in your code you have two DescriptorMatchers, one set to BRUTEFORCE_SL2 and the other set to HAMMING. Make sure you pass the right one, i.e. the BRUTEFORCE_SL2 matcher, to SIFT or SURF.
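
As a minimal sketch of that rule, the norm could be chosen from the descriptor Mat's type; the helper class and method names here are hypothetical, and the matcher constants are the ones already used in the question's code:

import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.features2d.DescriptorMatcher;

public class MatcherSelection {
    // Hypothetical helper: pick a brute-force matcher whose norm fits the
    // descriptor type. SIFT/SURF produce CV_32F descriptors (use SL2),
    // BRIEF/ORB produce CV_8U descriptors (use Hamming).
    static DescriptorMatcher matcherFor(Mat descriptors) {
        if (descriptors.type() == CvType.CV_32F) {
            return DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_SL2);
        }
        return DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);
    }
}

With a check like this, SIFT/SURF descriptors always go through the SL2 matcher and BRIEF/ORB descriptors through the Hamming one.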

However, it is better to use a FLANN-based matcher for SIFT or SURF, since they extract many more keypoints than ORB and FLANN works well with large numbers of keypoints. Read more about it here:
http://computer-vision-talks.com/2011/07/comparison-of-the-opencvs-feature-detection-algorithms-ii/

http://opencv.willowgarage.com/documentation/cpp/flann_fast_approximate_nearest_neighbor_search.html
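
A minimal sketch of that FLANN variant, assuming the same (older) OpenCV Java binding the question's code compiles against, where match() takes a Vector<DMatch> (newer bindings use MatOfDMatch instead); the class and method names are hypothetical:

import java.util.Vector;

import org.opencv.core.Mat;
import org.opencv.features2d.DMatch;
import org.opencv.features2d.DescriptorMatcher;

public class FlannMatching {
    // Hypothetical helper: build a FLANN-based matcher over the trained
    // SIFT/SURF descriptor Mats (float descriptors only) and match one
    // frame's descriptors against them.
    static void matchWithFlann(Vector<Mat> trainedFloatDescriptors,
                               Mat frameDescriptors,
                               Vector<DMatch> matches) {
        DescriptorMatcher flann =
                DescriptorMatcher.create(DescriptorMatcher.FLANNBASED);
        flann.add(trainedFloatDescriptors); // must all be CV_32F descriptors
        flann.train();
        flann.match(frameDescriptors, matches);
    }
}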

Update: L2 or L1 distance can also be used to match uchar descriptors. If you set the DescriptorMatcher to BRUTEFORCE, it may work for ORB as well (although the results are not as good).

Answer 1 (score: 1)

Are you sure that the size of your vectorFrame is not zero? I think I had the same problem. The issue would then be in the detection step; I think it returns an empty keypoint vector when the color code of your image is not right.

Log.e( "SVK APPLICATION","vectorFrame size = "+ vectorFrame.size());放在某处