I previously had this code, which used QTKit to capture a single image from a Mac's iSight camera:
- (NSError*)takePicture
{
    BOOL success;
    NSError* error;

    captureSession = [QTCaptureSession new];

    // Open the default video input device (the iSight camera).
    QTCaptureDevice* device = [QTCaptureDevice defaultInputDeviceWithMediaType: QTMediaTypeVideo];
    success = [device open: &error];
    if (!success) { return error; }

    QTCaptureDeviceInput* captureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice: device];
    success = [captureSession addInput: captureDeviceInput error: &error];
    if (!success) { return error; }

    // Deliver decompressed frames to the delegate callback below.
    QTCaptureDecompressedVideoOutput* captureVideoOutput = [QTCaptureDecompressedVideoOutput new];
    [captureVideoOutput setDelegate: self];
    success = [captureSession addOutput: captureVideoOutput error: &error];
    if (!success) { return error; }

    [captureSession startRunning];
    return nil;
}
- (void)captureOutput: (QTCaptureOutput*)captureOutput
  didOutputVideoFrame: (CVImageBufferRef)imageBuffer
     withSampleBuffer: (QTSampleBuffer*)sampleBuffer
       fromConnection: (QTCaptureConnection*)connection
{
    CVBufferRetain(imageBuffer);
    if (imageBuffer) {
        // We only want one frame, so tear down the session immediately.
        [captureSession removeOutput: captureOutput];
        [captureSession stopRunning];

        NSCIImageRep* imageRep = [NSCIImageRep imageRepWithCIImage: [CIImage imageWithCVImageBuffer: imageBuffer]];
        _result = [[NSImage alloc] initWithSize: [imageRep size]];
        [_result addRepresentation: imageRep];
        CVBufferRelease(imageBuffer);

        _done = YES;
    }
}
However, I found out today that QTKit has been deprecated, so we now have to use AVFoundation. Can anyone help me convert this code to its AVFoundation equivalent? Many of the methods seem to have the same names, but at the same time a lot is different, and I'm completely lost here... Any help?
Answer (score: 10)
OK, I found the solution! Here it is:
- (void)takePicture
{
    NSError* error;

    AVCaptureDevice* device = [AVCaptureDevice defaultDeviceWithMediaType: AVMediaTypeVideo];
    AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice: device error: &error];
    if (!input) {
        _error = error;
        _done = YES;
        return;
    }

    AVCaptureStillImageOutput* output = [AVCaptureStillImageOutput new];
    // Use the Core Video constant here; k32BGRAPixelFormat is the legacy
    // QuickDraw name for the same format.
    [output setOutputSettings: @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)}];

    captureSession = [AVCaptureSession new];
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
    [captureSession addInput: input];
    [captureSession addOutput: output];
    [captureSession startRunning];

    AVCaptureConnection* connection = [output connectionWithMediaType: AVMediaTypeVideo];
    [output captureStillImageAsynchronouslyFromConnection: connection
                                        completionHandler: ^(CMSampleBufferRef sampleBuffer, NSError* error) {
        if (error) {
            _error = error;
            _result = nil;
        }
        else {
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            if (imageBuffer) {
                CVBufferRetain(imageBuffer);
                NSCIImageRep* imageRep = [NSCIImageRep imageRepWithCIImage: [CIImage imageWithCVImageBuffer: imageBuffer]];
                _result = [[NSImage alloc] initWithSize: [imageRep size]];
                [_result addRepresentation: imageRep];
                CVBufferRelease(imageBuffer);
            }
        }
        _done = YES;
    }];
}
I hope this helps anyone who runs into the same problem.
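One caveat worth noting: AVCaptureStillImageOutput was itself deprecated later (in macOS 10.15) in favor of AVCapturePhotoOutput. The sketch below shows roughly how the same single-shot capture could look with that newer API; it assumes the same `captureSession`, `_result`, and `_done` ivars as above, and that `self` adopts AVCapturePhotoCaptureDelegate. It is a sketch under those assumptions, not a drop-in replacement.

    // Replace the AVCaptureStillImageOutput setup with:
    AVCapturePhotoOutput* photoOutput = [AVCapturePhotoOutput new];
    [captureSession addOutput: photoOutput];
    [captureSession startRunning];
    [photoOutput capturePhotoWithSettings: [AVCapturePhotoSettings photoSettings]
                                 delegate: self];

    // And receive the result in the delegate callback:
    - (void)captureOutput: (AVCapturePhotoOutput*)output
didFinishProcessingPhoto: (AVCapturePhoto*)photo
                   error: (NSError*)error
    {
        if (!error) {
            // fileDataRepresentation returns encoded image data (e.g. JPEG/HEIF).
            NSData* data = [photo fileDataRepresentation];
            _result = [[NSImage alloc] initWithData: data];
        }
        _done = YES;
    }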