I have an application that records video, and I need a method to be triggered every time a frame is captured. After banging my head against the wall, I decided to try the following: create a dispatch queue, the same way I would when getting video from an output, just so that a method gets called whenever the camera records a frame.
I'm trying to understand a piece of code Apple wrote for recording video, to work out how I should add the dispatch queue. Below is Apple's code; the part marked between the asterisks is what I added in order to create the queue. It compiles without errors, but captureOutput:didOutputSampleBuffer:fromConnection: is never called.
- (BOOL)setupSessionWithPreset:(NSString *)sessionPreset error:(NSError **)error
{
    BOOL success = NO;

    // Init the device inputs
    AVCaptureDeviceInput *videoInput = [[[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:error] autorelease];
    [self setVideoInput:videoInput]; // stash this for later use if we need to switch cameras

    AVCaptureDeviceInput *audioInput = [[[AVCaptureDeviceInput alloc] initWithDevice:[self audioDevice] error:error] autorelease];
    [self setAudioInput:audioInput];

    AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    [self setMovieFileOutput:movieFileOutput];
    [movieFileOutput release];

    // Setup and start the capture session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([session canAddInput:videoInput]) {
        [session addInput:videoInput];
    }
    if ([session canAddInput:audioInput]) {
        [session addInput:audioInput];
    }
    if ([session canAddOutput:movieFileOutput]) {
        [session addOutput:movieFileOutput];
    }
    [session setSessionPreset:sessionPreset];

    // I added this *****************
    dispatch_queue_t queue = dispatch_queue_create("myqueue", NULL);
    [[self videoDataOutput] setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
    // ******************** end of my code

    [session startRunning];
    [self setSession:session];
    [session release];

    success = YES;
    return success;
}
All I need is a way to process every frame that is being recorded.
Thanks
Answer 0 (score: 2)
Once you've set yourself up as the delegate, you'll receive the following call:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
every time a new frame is captured. You can put whatever code you want in there, just be careful, because you won't be on the main thread. Doing a quick [target performSelectorOnMainThread:@selector(methodYouActuallyWant)] inside -captureOutput:didOutputSampleBuffer:fromConnection: is probably safest.
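As a concrete sketch, such a delegate method might look like the following (the -processNewFrame selector is just a hypothetical placeholder for whatever method you actually want run on the main thread):
// Called on the queue passed to -setSampleBufferDelegate:queue:, not on the main thread.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Keep the work here short; bounce anything UI-related back to the main thread.
    [self performSelectorOnMainThread:@selector(processNewFrame)
                           withObject:nil
                        waitUntilDone:NO];
}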
Also: I use the following setup in my own code, and it does successfully result in the delegate method being called. I can't see any substantive difference between it and what you're using.
- (id)initWithSessionPreset:(NSString *)sessionPreset delegate:(id <AAVideoSourceDelegate>)aDelegate
{
#ifndef TARGET_OS_EMBEDDED
    return nil;
#else
    if(self = [super init])
    {
        delegate = aDelegate;

        NSError *error = nil;

        // create a low-quality capture session
        session = [[AVCaptureSession alloc] init];
        session.sessionPreset = sessionPreset;

        // grab a suitable device...
        device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

        // ...and a device input
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
        if(!input || error)
        {
            [self release];
            return nil;
        }
        [session addInput:input];

        // create an AVCaptureVideoDataOutput to route output to us
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        [session addOutput:[output autorelease]];

        // create a suitable dispatch queue, GCD style, and hook self up as the delegate
        dispatch_queue_t queue = dispatch_queue_create("aQueue", NULL);
        [output setSampleBufferDelegate:self queue:queue];
        dispatch_release(queue);

        // set 32bpp BGRA pixel format, since I'll want to make sense of the frame
        output.videoSettings =
            [NSDictionary
                dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    }
    return self;
#endif
}

- (void)start
{
    [session startRunning];
}

- (void)stop
{
    [session stopRunning];
}
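Once frames are arriving, a rough sketch of the matching delegate callback for this class could look like the code below. The AAVideoSourceDelegate method called at the end is made up for illustration; substitute whatever your protocol actually declares.
// Rough sketch only: pulls out the BGRA pixel data configured above and hands it on.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // With the 32BGRA setting above, the base address points at packed BGRA bytes.
    void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

    // Hypothetical delegate call, not part of the original post.
    [delegate videoSource:self didReceiveFrameBytes:baseAddress width:width height:height bytesPerRow:bytesPerRow];

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}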
Answer 1 (score: 0)
The part that creates the dispatch queue and sets self as the sample buffer delegate:
// create a suitable dispatch queue, GCD style, and hook self up as the delegate
dispatch_queue_t queue = dispatch_queue_create("aQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);
is also very important.
And inside
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
be sure to put an
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
at the beginning and a
[pool drain];
at the end, or it will crash after too many frames have been processed.
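Put together, that looks something like this under manual retain/release (with ARC you would use an @autoreleasepool block instead):
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // ... per-frame processing goes here ...

    [pool drain];
}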