I want to lower the frame rate of the video device on the iPhone 4S so that the didOutputSampleBuffer delegate gets called less frequently. This is to improve performance, since I process each frame and need a large frame for the detail.
I tried using the following when setting up the AVSession:
AVCaptureConnection *conn = [self.output connectionWithMediaType:AVMediaTypeVideo];
[conn setVideoMinFrameDuration:CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND)];
[conn setVideoMaxFrameDuration:CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND)];
But this has no effect: I can change CAPTURE_FRAMES_PER_SECOND from 1 to 60 and see no difference in performance, nor any slowing of the video capture. Why does this have no effect? How can I lower the capture frame rate of the video device?
I set up the session with the following code:
// Define the devices and the session and the settings
self.session = [[AVCaptureSession alloc] init];
//self.session.sessionPreset = AVCaptureSessionPresetPhoto;
//self.session.sessionPreset = AVCaptureSessionPresetHigh;
self.session.sessionPreset = AVCaptureSessionPreset1280x720;
self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
self.input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];
// Add the video frame output
self.output = [[AVCaptureVideoDataOutput alloc] init];
[self.output setAlwaysDiscardsLateVideoFrames:YES];
self.output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
forKey:(id)kCVPixelBufferPixelFormatTypeKey];
// A dispatch queue to get frames
dispatch_queue_t queue;
queue = dispatch_queue_create("frame_queue", NULL);
// Setup the frame rate
AVCaptureConnection *conn = [self.output connectionWithMediaType:AVMediaTypeVideo];
[conn setVideoMinFrameDuration:CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND)];
[conn setVideoMaxFrameDuration:CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND)];
// Setup input and output and set the delegate to self
[self.output setSampleBufferDelegate:self queue:queue];
[self.session addInput:self.input];
[self.session addOutput:self.output];
// Start the session
[self.session startRunning];
I capture the frames using the "didOutputSampleBuffer" delegate implementation below:
// The delegate method where we get our image data frames from
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
{
// Extract a UIImage
CVPixelBufferRef pixel_buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixel_buffer];
// Capture the image
CGImageRef ref = [self.context createCGImage:ciImage fromRect:ciImage.extent];
// This sets the captured image orientation correctly
UIImage *image = [UIImage imageWithCGImage:ref scale:1.0 orientation:UIImageOrientationLeft];
// Release the CGImage
CGImageRelease(ref);
// Update the UI on the main thread but throttle the processing
[self performSelectorOnMainThread:@selector(updateUIWithCapturedImageAndProcessWithImage:) withObject:image waitUntilDone:YES];
}
Answer 0 (score: 1)
This is a partial answer: I believe the Quicktime video capture engine changed between iOS 5 and iOS 6. In iOS 5 it was possible to capture video at 60 FPS, and some apps used this to record footage for smooth slow-motion playback (e.g. the SloPro app). In iOS 6 the same approach no longer reaches 60 FPS. There was a long discussion of this issue in a MacRumors forum thread:
Will jailbreak of iPhone 4S, iOS 6.1, allow recording of 60 FPS video?
Hopefully you can find some information there that offers a solution to your problem. I'd be curious to hear if anyone gets this working again. I miss 60 FPS recording...
Answer 1 (score: 1)
I'm not sure which iOS version you're running, but wrap the code like this:
AVCaptureConnection *conn = [self.output connectionWithMediaType:AVMediaTypeVideo];
if ([conn isVideoMaxFrameDurationSupported] && [conn isVideoMinFrameDurationSupported])
{
[conn setVideoMinFrameDuration:CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND)];
[conn setVideoMaxFrameDuration:CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND)];
}
else
NSLog(@"Setting Max and/or Min frame duration is unsupported");
and go from there. I suspect it's unsupported on your iOS version.
Answer 2 (score: 0)
The OP seems to already know how to set the frame rate, but not why the code isn't working.
An AVCaptureConnection is not created until the inputs and outputs have been added to the capture session (see the AVCaptureConnection documentation).
So I suspect 'conn' is nil, and setting the frame durations on it silently does nothing. Move the code under "Setup the frame rate" to after "Setup input and output and set the delegate to self".
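As a sketch, the reordered setup would look like this (reusing the session, output, queue, and CAPTURE_FRAMES_PER_SECOND names from the question; untested, and note the supported-check guards as in Answer 1):

```objc
// Add input and output first, so the connection actually exists
[self.session addInput:self.input];
[self.session addOutput:self.output];
[self.output setSampleBufferDelegate:self queue:queue];

// Now connectionWithMediaType: should return a non-nil connection
AVCaptureConnection *conn = [self.output connectionWithMediaType:AVMediaTypeVideo];
if ([conn isVideoMinFrameDurationSupported])
    [conn setVideoMinFrameDuration:CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND)];
if ([conn isVideoMaxFrameDurationSupported])
    [conn setVideoMaxFrameDuration:CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND)];

// Start the session last
[self.session startRunning];
```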