I record video using the cameraOverlayView option to place my own view over the camera. While recording, my view is shown, but when I save the video and play it back, the overlay does not appear.
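For reference, the recording is set up roughly like this (a minimal sketch; myOverlayView is just a placeholder for the view I place over the camera):

    #import <MobileCoreServices/MobileCoreServices.h> // for kUTTypeMovie

    // Rough sketch of the setup described above: a UIImagePickerController recording
    // video with a custom view laid over the camera preview via cameraOverlayView.
    UIImagePickerController *picker = [[[UIImagePickerController alloc] init] autorelease];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
    picker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo;
    picker.delegate = self;

    // myOverlayView is a placeholder for the view drawn over the video.
    picker.cameraOverlayView = myOverlayView;

    [self presentModalViewController:picker animated:YES];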
Can anyone help me solve this?
Thanks in advance.
Answer 0 (score: 1)
I'm afraid it won't be that easy. You have to actually capture the individual frames yourself using the AVCaptureSession class. As each frame comes in, you can composite your overlay view onto it and then write the composited frames out to a movie file (for example with an AVAssetWriter).
It's quite complicated. Here is some code for setting up the capture to get you started:
// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session
    session = [[AVCaptureSession alloc] init]; // note we never release this...leak?

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetLow; // adjust this! AVCaptureSessionPresetLow

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"Yikes, null input");
        return;
    }
    [session addInput:input];

    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    if (!output) {
        // Handle the error appropriately.
        NSLog(@"ERROROROROR");
        return;
    }
    output.alwaysDiscardsLateVideoFrames = YES; // cribbed this from somewhere -- seems related to our becoming unresponsive
    [session addOutput:output];

    // Configure your output.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Specify the pixel format
    // kCVPixelFormatType_32RGBA or kCVPixelFormatType_32BGRA
    output.videoSettings =
        [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    // If you wish to cap the frame rate to a known value, such as 15 fps, set
    // minFrameDuration.
    output.minFrameDuration = CMTimeMake(1, VIDEO_CAPTURE_FRAMERATE); // WATCH THIS!

    // Watch for session-level problems while capturing.
    NSNotificationCenter *notify = [NSNotificationCenter defaultCenter];
    [notify addObserver:self selector:@selector(onVideoError:) name:AVCaptureSessionRuntimeErrorNotification object:session];
    [notify addObserver:self selector:@selector(onVideoInterrupted:) name:AVCaptureSessionWasInterruptedNotification object:session];
    [notify addObserver:self selector:@selector(onVideoEnded:) name:AVCaptureSessionInterruptionEndedNotification object:session];
    [notify addObserver:self selector:@selector(onVideoDidStopRunning:) name:AVCaptureSessionDidStopRunningNotification object:session];
    [notify addObserver:self selector:@selector(onVideoStart:) name:AVCaptureSessionDidStartRunningNotification object:session];

    // Start capturing, as promised in the comment above.
    [session startRunning];
}
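Once frames are arriving on that queue, the compositing happens in the sample buffer delegate. Here is a rough sketch of what that callback could look like; overlayView and compositedFrames are placeholder names I made up for your overlay view and for wherever you hand the frames off, and in a real app you would want to snapshot the overlay on the main thread:

    // Delegate callback for each captured frame (AVCaptureVideoDataOutputSampleBufferDelegate).
    // Sketch only: converts the BGRA pixel buffer to a CGImage, draws the overlay on top,
    // and collects the composited UIImage. overlayView / compositedFrames are assumptions.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // Wrap the raw BGRA pixels in a bitmap context and pull out a CGImage.
        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef bitmapContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                           colorSpace,
                                                           kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef frameImage = CGBitmapContextCreateImage(bitmapContext);
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        // Draw the frame, then render the overlay view's layer on top of it.
        // (renderInContext: really belongs on the main thread -- this is just the idea.)
        UIGraphicsBeginImageContext(CGSizeMake(width, height));
        [[UIImage imageWithCGImage:frameImage] drawInRect:CGRectMake(0, 0, width, height)];
        [overlayView.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *compositedFrame = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        // Hand the composited frame off to whatever writes the movie; here we just collect it.
        [compositedFrames addObject:compositedFrame];

        CGImageRelease(frameImage);
        CGContextRelease(bitmapContext);
        CGColorSpaceRelease(colorSpace);
    }

From there, the usual route is to append each composited frame to an AVAssetWriter (via an AVAssetWriterInputPixelBufferAdaptor) to produce the final movie file.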