I am trying to add face detection to my app, but the code I added gives me a CGRect that has nothing to do with the face.
Here is the code:
CIImage *cIImage = [CIImage imageWithCGImage:self.imageView.image.CGImage];
CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:@{ CIDetectorAccuracy : CIDetectorAccuracyHigh }];
NSArray *features = [faceDetector featuresInImage:cIImage];
for (CIFaceFeature *faceObject in features)
{
    FaceLocation.x = faceObject.bounds.origin.x;
    FaceLocation.y = faceObject.bounds.origin.y;
} // The location here is far away from the actual face
But this code gives me a location far away from the actual face. What am I doing wrong?
Answer 0 (score: 2)
The problem comes from the mismatch between the UIImage's orientation and CIDetectorImageOrientation. From the iOS documentation:
CIDetectorImageOrientation
A key used to specify the display orientation of the image whose features you want to detect. This key is an NSNumber object with the same value as defined by the TIFF and EXIF specifications; values can range from 1 through 8. The value specifies where the origin (0,0) of the image is located. If not present, the default value is 1, which means the origin of the image is the top left. For details on the image origin specified by each value, see kCGImagePropertyOrientation.
Available in iOS 5.0 and later.
Declared in CIDetector.h.
You have to specify the CIDetectorImageOrientation. Here is what I did:
int exifOrientation;
switch (self.image.imageOrientation) {
    case UIImageOrientationUp:
        exifOrientation = 1;
        break;
    case UIImageOrientationDown:
        exifOrientation = 3;
        break;
    case UIImageOrientationLeft:
        exifOrientation = 8;
        break;
    case UIImageOrientationRight:
        exifOrientation = 6;
        break;
    case UIImageOrientationUpMirrored:
        exifOrientation = 2;
        break;
    case UIImageOrientationDownMirrored:
        exifOrientation = 4;
        break;
    case UIImageOrientationLeftMirrored:
        exifOrientation = 5;
        break;
    case UIImageOrientationRightMirrored:
        exifOrientation = 7;
        break;
    default:
        exifOrientation = 1; // top-left origin, the EXIF default
        break;
}
NSDictionary *detectorOptions = @{ CIDetectorAccuracy : CIDetectorAccuracyHigh };
CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];
NSArray *features = [faceDetector featuresInImage:[CIImage imageWithCGImage:self.image.CGImage]
options:@{CIDetectorImageOrientation:[NSNumber numberWithInt:exifOrientation]}];
Once the features are detected, you also need to map the coordinates back into the UIImageView's coordinate system. You can use my gist here for the coordinate-system conversion: https://gist.github.com/laoyang/5747004
Answer 1 (score: 0)