I'm learning AVCaptureSession and how to capture multiple images using its delegate method
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
My goal is to capture one or more images per second at a predefined rate, for example 1 or 2 images every second. So I set up:
AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
captureOutput.alwaysDiscardsLateVideoFrames = YES;
captureOutput.minFrameDuration = CMTimeMake(1, 1); // request at most 1 frame per second
When I start [self.captureSession startRunning];, my log shows the delegate being called 20 times per second. Where does that rate come from, and how can I capture images at my intended interval? (Note that minFrameDuration on AVCaptureVideoDataOutput was deprecated in iOS 5 in favor of videoMinFrameDuration on AVCaptureConnection.)
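One way to get the intended rate regardless of the camera's frame rate is to throttle inside the delegate itself. A minimal sketch, assuming a `lastCaptureTime` instance variable of type CFTimeInterval (hypothetical, not from the original code) and QuartzCore for CACurrentMediaTime():

```objc
// Sketch: skip frames until one second has elapsed since the last capture.
// `lastCaptureTime` is a hypothetical CFTimeInterval ivar, initially 0.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CFTimeInterval now = CACurrentMediaTime();
    if (now - lastCaptureTime < 1.0) {
        return; // too soon, drop this frame
    }
    lastCaptureTime = now;
    // ... process this frame (e.g. convert sampleBuffer to a UIImage) ...
}
```

This keeps the session running at its native frame rate but only processes roughly one frame per second.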
Answer 0 (score: 10)
You can use the function given below; if you want to capture at a specific interval, set up a timer and call the function again from it.
- (IBAction)captureNow
{
    // Find the first video connection on the still image output.
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [stillImageOutput connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    NSLog(@"About to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                  completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        // EXIF metadata attached to the captured frame, if any.
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments) {
            // Do something with the attachments.
            NSLog(@"Attachments: %@", exifAttachments);
        } else {
            NSLog(@"No attachments found.");
        }

        // Convert the sample buffer to JPEG data and display it.
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        [[self vImage] setImage:image];
    }];
}
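The timer mentioned above could be sketched as follows, assuming a `captureTimer` NSTimer property (hypothetical, not part of the original code); invalidate it when you are done capturing:

```objc
// Sketch: fire captureNow once per second on the main run loop.
self.captureTimer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                     target:self
                                                   selector:@selector(captureNow)
                                                   userInfo:nil
                                                    repeats:YES];
```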
For more details, see iOS4: Take photos with live video preview using AVFoundation.
Answer 1 (score: 0)
Something I struggled with for a while was a large delay (about 5 seconds) when taking a photo and then trying to set a UIImage with the captured image. Inside the
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
method, you can't use regular calls like [self.image setImage:img] for anything tied to the UI; you have to run them on the main thread, like this:
[self.image performSelectorOnMainThread:@selector(setImage:) withObject:img waitUntilDone:TRUE];
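An equivalent, more idiomatic pattern is to dispatch the UI update to the main queue with GCD; a sketch, where `img` is the UIImage built from the sample buffer as above:

```objc
// Sketch: hop to the main queue before touching UIKit.
dispatch_async(dispatch_get_main_queue(), ^{
    [self.image setImage:img];
});
```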
Hope this helps someone.