I'm looking into displaying the front-camera video feed in a UIView, similar to FaceTime. I know this can easily be done with AVCaptureVideoPreviewLayer. Is there another way to do it without using AVCaptureVideoPreviewLayer?
This is for educational purposes only.
Update: I found this can also be done with UIImagePickerController:

UIImagePickerController *cameraView = [[UIImagePickerController alloc] init];
cameraView.sourceType = UIImagePickerControllerSourceTypeCamera;
cameraView.showsCameraControls = NO;
[self.view addSubview:cameraView.view];
[cameraView viewWillAppear:YES];
[cameraView viewDidAppear:YES];
Answer (score: 3)
If you want to manipulate the pixels yourself, you can put the following method in a class that you assign as the delegate of an AVCaptureVideoDataOutput (i.e., a class conforming to AVCaptureVideoDataOutputSampleBufferDelegate):
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pb = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (CVPixelBufferLockBaseAddress(pb, 0) != kCVReturnSuccess) { // zero is success
        NSLog(@"Error locking pixel buffer");
        return;
    }
    size_t bufferHeight = CVPixelBufferGetHeight(pb);
    size_t bufferWidth  = CVPixelBufferGetWidth(pb);
    size_t bytesPerRow  = CVPixelBufferGetBytesPerRow(pb);
    unsigned char *rowBase = CVPixelBufferGetBaseAddress(pb);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    if (colorSpace == NULL) {
        NSLog(@"Error creating color space");
        CVPixelBufferUnlockBaseAddress(pb, 0);
        return;
    }
    // Create a bitmap graphics context with the sample buffer data.
    // This assumes the capture output delivers kCVPixelFormatType_32BGRA frames;
    // for BGRA, the bitmap info must be little-endian with a (premultiplied) alpha byte first.
    // (kCGImageAlphaNone with 8 bits per component is not a supported RGB combination.)
    CGContextRef context = CGBitmapContextCreate(rowBase, bufferWidth, bufferHeight, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context.
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    UIImage *currentImage = [UIImage imageWithCGImage:quartzImage];

    // Free up the image, context, and color space.
    CGImageRelease(quartzImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    if (CVPixelBufferUnlockBaseAddress(pb, 0) != kCVReturnSuccess) { // zero is success
        NSLog(@"Error unlocking pixel buffer");
    }
}
Then assign that image to a UIImageView in your view controller (remember to do UI work on the main thread, since the delegate is called on a background queue). Also pay attention to the bitmap-info flag passed to CGBitmapContextCreate; the right value depends on the pixel format you are working with.
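For completeness, the delegate method above only fires once a capture session has been configured with an AVCaptureVideoDataOutput. A minimal setup might look like the following sketch (untested; it assumes the enclosing class has a `session` property to retain the session, and the queue name is arbitrary):

- (void)startCapture
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Pick the front camera to match the FaceTime-style use case.
    AVCaptureDevice *camera = nil;
    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (d.position == AVCaptureDevicePositionFront) { camera = d; break; }
    }

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (!input) {
        NSLog(@"Input error: %@", error);
        return;
    }
    [session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    // Request BGRA frames so they match the bitmap context used in the delegate method.
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    dispatch_queue_t queue = dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL);
    [output setSampleBufferDelegate:self queue:queue];
    [session addOutput:output];

    [session startRunning];
    self.session = session; // keep a strong reference so the session isn't deallocated
}

Each frame then arrives in captureOutput:didOutputSampleBuffer:fromConnection: on the serial queue, where it can be converted to a UIImage as shown above.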