Getting keypoints and descriptors with OpenCV takes too much time

Asked: 2015-10-26 09:06:26

Tags: ios opencv image-processing surf brisk

I am using the OpenCV SURF method to get keypoints and descriptors. It works correctly, but it takes a lot of time.

My code is:

NSLog(@"Keypoint Detects");
//-- Step 1: Detect the keypoints using SURF Detector

int minHessian = 400;
SurfFeatureDetector detector( minHessian );
std::vector<KeyPoint> keypoints_object, keypoints_scene;
detector.detect( img_1, keypoints_object );
detector.detect( img_2, keypoints_scene );

//-- Step 2: Calculate descriptors (feature vectors)
NSLog(@"Descriptor Detects");
SurfDescriptorExtractor extractor;
Mat descriptors_object, descriptors_scene;
extractor.compute( img_1, keypoints_object, descriptors_object );
extractor.compute( img_2, keypoints_scene, descriptors_scene );

//-- Step 3: Matching descriptor vectors using FLANN matcher
NSLog(@"Matching Detects");
FlannBasedMatcher matcher;
std::vector< DMatch > matches;
matcher.match( descriptors_object, descriptors_scene, matches );

The Xcode timing output is:

2015-10-26 13:22:27.282 AVDemo[288:26112] Keypoint Detects

2015-10-26 13:22:28.361 AVDemo[288:26112] Descriptor Detects

2015-10-26 13:22:30.077 AVDemo[288:26112] Matching Detects

This takes about 2 seconds to compute.
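
One variant I have not tried or timed yet is running SURF in upright mode with the shorter 64-float descriptors, which should reduce the per-keypoint work. This is only a sketch, assuming the OpenCV 2.4 nonfree constructor SURF(hessianThreshold, nOctaves, nOctaveLayers, extended, upright):

// Sketch only: upright SURF (no orientation estimation) with extended=false,
// which produces 64-float descriptors instead of 128.
int minHessian = 400;
SurfFeatureDetector detector( minHessian, 4, 2, false, true );
SurfDescriptorExtractor extractor( minHessian, 4, 2, false, true );
std::vector<KeyPoint> keypoints_object;
Mat descriptors_object;
detector.detect( img_1, keypoints_object );
extractor.compute( img_1, keypoints_object, descriptors_object );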

I also tried another method to get them:

NSLog(@"Detect Keypoints");
cv::Ptr<cv::BRISK> ptrBrisk = cv::BRISK::create();
ptrBrisk->detect(img_1, camkeypoints);

//for keypoints
NSLog(@"Compute Keypoints");
ptrBrisk->compute(img_1, camkeypoints,camdescriptors);
if(camdescriptors.type()!=CV_32F) {
    camdescriptors.convertTo(camdescriptors, CV_32F);
}
NSLog(@"camera image conversion end");

This also works correctly, but it has the same timing problem. Xcode output:

2015-10-26 14:19:47.939 AVDemo[305:32700] Detect Keypoints

2015-10-26 14:19:49.787 AVDemo[305:32700] Compute Keypoints

2015-10-26 14:19:49.818 AVDemo[305:32700] camera image conversion end
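
The convertTo(CV_32F) in the BRISK snippet above is only there so the descriptors can be fed to a float-based matcher such as FLANN. A sketch I have not timed yet of matching the binary BRISK descriptors directly with a brute-force Hamming matcher, which skips the conversion entirely (scenedescriptors is a placeholder for the BRISK descriptors of the second image):

// Sketch only: match binary BRISK descriptors (CV_8U) with Hamming distance,
// so no conversion to CV_32F is needed.
cv::BFMatcher matcher(cv::NORM_HAMMING);
std::vector<cv::DMatch> matches;
matcher.match(camdescriptors, scenedescriptors, matches);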

How can I minimize this time?
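
Independent of the detector choice, one general option I am considering is downscaling the frames before detection and extraction, under the assumption that the lower resolution is acceptable for my use case:

// Sketch only: run detection/extraction on a half-resolution copy of the frame.
cv::Mat img_1_small;
cv::resize( img_1, img_1_small, cv::Size(), 0.5, 0.5, cv::INTER_AREA );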

Now I have used FastFeatureDetector, which cuts some of the time, but I still need SurfDescriptorExtractor.

The new code is:

NSLog(@"Keypoint Detects");

//-- Step 1: Detect the keypoints using SURF Detector
int minHessian = 15;

FastFeatureDetector detector( minHessian );

std::vector<KeyPoint> keypoints_object, keypoints_scene;

detector.detect( img_1, keypoints_object );
detector.detect( img_2, keypoints_scene );

//-- Step 2: Calculate descriptors (feature vectors)
NSLog(@"Descriptor Detects");
SurfDescriptorExtractor extractor;

Mat descriptors_object, descriptors_scene;

extractor.compute( img_1, keypoints_object, descriptors_object );
extractor.compute( img_2, keypoints_scene, descriptors_scene );

//-- Step 3: Matching descriptor vectors using FLANN matcher
NSLog(@"Matching Detects");
FlannBasedMatcher matcher;
std::vector< DMatch > matches;
matcher.match( descriptors_object, descriptors_scene, matches );

Xcode output:

2015-10-26 16:06:19.018 AVDemo[375:47824] Keypoint Detects

2015-10-26 16:06:19.067 AVDemo[375:47824] Descriptor Detects

2015-10-26 16:06:21.117 AVDemo[375:47824] Matching Detects
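
Since each NSLog fires before its step runs, the roughly 2-second gap between "Descriptor Detects" and "Matching Detects" above is the SurfDescriptorExtractor::compute() calls on all the FAST keypoints. A sketch (not timed, assuming the same OpenCV 2.4-style API as in the snippet above) of replacing that extractor with the binary BRISK one and matching with Hamming distance instead of FLANN:

//-- Step 2 (sketch): compute binary BRISK descriptors on the FAST keypoints
cv::BRISK extractor;   // BRISK also implements the DescriptorExtractor interface
Mat descriptors_object, descriptors_scene;
extractor.compute( img_1, keypoints_object, descriptors_object );
extractor.compute( img_2, keypoints_scene, descriptors_scene );

//-- Step 3 (sketch): brute-force Hamming matcher for binary descriptors
cv::BFMatcher matcher( cv::NORM_HAMMING );
std::vector< DMatch > matches;
matcher.match( descriptors_object, descriptors_scene, matches );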

0 Answers