I have a camera preview layer, with the camera preset set to 1280x720. On top of the preview layer I added a square UIView with a border.
My goal is to crop that region out of the camera image.
The method that extracts the data from the camera:
-(CGImageRef)createImageFromBuffer:(CVImageBufferRef)buffer
                              left:(size_t)left
                               top:(size_t)top
                             width:(size_t)width
                            height:(size_t)height CF_RETURNS_RETAINED {
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(buffer);
    size_t dataWidth = CVPixelBufferGetWidth(buffer);
    size_t dataHeight = CVPixelBufferGetHeight(buffer);
    if (left + width > dataWidth || top + height > dataHeight) {
        [NSException raise:NSInvalidArgumentException
                    format:@"Crop rectangle does not fit within image data."];
    }
    // Round the new row length (4 bytes per BGRA pixel) up to a 16-byte boundary.
    size_t newBytesPerRow = ((width * 4 + 0xf) >> 4) << 4;
    CVPixelBufferLockBaseAddress(buffer, 0);
    int8_t *baseAddress = (int8_t *)CVPixelBufferGetBaseAddress(buffer);
    size_t size = newBytesPerRow * height;
    int8_t *bytes = (int8_t *)malloc(size);
    if (newBytesPerRow == bytesPerRow) {
        // Fast path: the crop spans the full row width, so copy whole rows at once.
        memcpy(bytes, baseAddress + top * bytesPerRow, size);
    } else {
        // Copy the cropped span of each row into the tightly packed buffer.
        for (size_t y = 0; y < height; y++) {
            memcpy(bytes + y * newBytesPerRow,
                   baseAddress + left * 4 + (top + y) * bytesPerRow,
                   newBytesPerRow);
        }
    }
    CVPixelBufferUnlockBaseAddress(buffer, 0);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(bytes,
                                                    width,
                                                    height,
                                                    8,
                                                    newBytesPerRow,
                                                    colorSpace,
                                                    kCGBitmapByteOrder32Little |
                                                    kCGImageAlphaNoneSkipFirst);
    CGColorSpaceRelease(colorSpace);
    CGImageRef result = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);
    free(bytes);
    return result;
}
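The bit twiddling on newBytesPerRow rounds the packed row length (4 bytes per BGRA pixel) up to the next multiple of 16, which keeps each row 16-byte aligned for CGBitmapContextCreate. A minimal standalone sketch of that computation (the helper name is mine, not from the original):

```c
#include <stddef.h>

/* Round a row of `width` 32-bit BGRA pixels up to a 16-byte boundary,
   mirroring ((width*4 + 0xf) >> 4) << 4 from createImageFromBuffer:. */
static size_t aligned_bytes_per_row(size_t width) {
    return ((width * 4 + 0xF) >> 4) << 4;
}
```

For example, a 301-pixel row packs to 1204 bytes but gets padded to 1216, while a 320-pixel row is already aligned at 1280.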
The code that rotates the image:
- (CGImageRef)createRotatedImage:(CGImageRef)original degrees:(float)degrees CF_RETURNS_RETAINED {
    if (degrees == 0.0f) {
        CGImageRetain(original);
        return original;
    } else {
        double radians = degrees * M_PI / 180;
#if TARGET_OS_EMBEDDED || TARGET_IPHONE_SIMULATOR
        radians = -1 * radians;
#endif
        size_t _width = CGImageGetWidth(original);
        size_t _height = CGImageGetHeight(original);
        CGRect imgRect = CGRectMake(0, 0, _width, _height);
        CGAffineTransform __transform = CGAffineTransformMakeRotation(radians);
        CGRect rotatedRect = CGRectApplyAffineTransform(imgRect, __transform);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(NULL,
                                                     rotatedRect.size.width,
                                                     rotatedRect.size.height,
                                                     CGImageGetBitsPerComponent(original),
                                                     0,
                                                     colorSpace,
                                                     kCGBitmapAlphaInfoMask & kCGImageAlphaPremultipliedFirst);
        CGContextSetAllowsAntialiasing(context, FALSE);
        CGContextSetInterpolationQuality(context, kCGInterpolationNone);
        CGColorSpaceRelease(colorSpace);
        // Rotate around the center of the new, larger canvas.
        CGContextTranslateCTM(context,
                              +(rotatedRect.size.width / 2),
                              +(rotatedRect.size.height / 2));
        CGContextRotateCTM(context, radians);
        CGContextDrawImage(context,
                           CGRectMake(-imgRect.size.width / 2,
                                      -imgRect.size.height / 2,
                                      imgRect.size.width,
                                      imgRect.size.height),
                           original);
        CGImageRef rotatedImage = CGBitmapContextCreateImage(context);
        CGContextRelease(context);
        return rotatedImage;
    }
}
Extracting the data:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (self.lastDecodeTime && [self.lastDecodeTime timeIntervalSinceNow] > -DECODE_LIMIT_TIME) {
        return;
    }
    if (self.scannerDisabled) {
        return;
    }
    self.lastDecodeTime = [NSDate date];
    CVImageBufferRef videoFrame = CMSampleBufferGetImageBuffer(sampleBuffer);
    CGFloat cameraFrameWidth = CVPixelBufferGetWidth(videoFrame);
    CGFloat cameraFrameHeight = CVPixelBufferGetHeight(videoFrame);
    CGPoint rectPoint = self.rectangleView.frame.origin;
    rectPoint = [self.previewLayer convertPoint:rectPoint fromLayer:self.view.layer];
    CGPoint cameraPoint = [self.previewLayer captureDevicePointOfInterestForPoint:rectPoint];
    // Scale the normalized point into buffer pixels: x by width, y by height.
    CGPoint matrixPoint = CGPointMake(cameraPoint.x * cameraFrameWidth, cameraPoint.y * cameraFrameHeight);
    CGFloat D = self.rectangleView.frame.size.width * 2.0;
    CGRect matrixRect = CGRectMake(matrixPoint.x, matrixPoint.y, D, D);
    CGImageRef videoFrameImage = [self createImageFromBuffer:videoFrame left:matrixRect.origin.x top:matrixRect.origin.y width:matrixRect.size.width height:matrixRect.size.height];
    CGImageRef rotatedImage = [self createRotatedImage:videoFrameImage degrees:self.rotationDeg];
    CGImageRelease(videoFrameImage);
    ...
    ...
    ...
}
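captureDevicePointOfInterestForPoint: returns a normalized point with each coordinate in 0..1, so each axis has to be scaled by its own buffer dimension; note that a slip that scales both axes by the same coordinate (e.g. cameraPoint.x twice) would by itself produce exactly this kind of offset. The mapping as a plain-C sketch (the helper name is mine):

```c
/* Map a normalized capture-device point (0..1 on each axis) into pixel
   coordinates of a video buffer. Each axis must be scaled by its own
   dimension; reusing one coordinate for both axes shifts the crop. */
static void device_point_to_pixels(double nx, double ny,
                                   double bufWidth, double bufHeight,
                                   double *px, double *py) {
    *px = nx * bufWidth;
    *py = ny * bufHeight;
}
```

With a 1280x720 buffer, the normalized point (0.5, 0.25) lands at pixel (640, 180).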
For debugging I added a small image view in the top-left corner to see the crop result. You can see that I get the orientation right, but there is some kind of offset. I assume that because the camera buffer is 1280x720 while the iPhone screen has a different aspect ratio, some cropping takes place, and that may be the offset I'm dealing with.
Screenshot attached; you can see that the cropped image is not centered.
P.S. These are the output settings:
AVCaptureVideoDataOutput *output = [AVCaptureVideoDataOutput new];
NSDictionary *rgbOutputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCMPixelFormat_32BGRA) };
[output setVideoSettings:rgbOutputSettings];
Any ideas?
Answer 0 (score: 0)
Try this to get a cropped image from the whole view:
[self.view resizableSnapshotViewFromRect:requiredRectToCrop afterScreenUpdates:YES withCapInsets:UIEdgeInsetsZero];