I want to extract frames from an AVCaptureSession's live feed, and I'm using Apple's AVCam as a test case. Here is the link to AVCam:
https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html
I found that captureOutput:didOutputSampleBuffer:fromConnection is not being called, and I'd like to know why, or what I'm doing wrong.
Here's what I did:
(1) I made AVCamViewController the delegate:
@interface AVCamViewController () <AVCaptureFileOutputRecordingDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>
(2) I created an AVCaptureVideoDataOutput object and added it to the session:
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:videoDataOutput])
{
    [session addOutput:videoDataOutput];
}
(3) I added the delegate method and tested it by logging a string:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"I am called");
}
The test app works, but captureOutput:didOutputSampleBuffer:fromConnection is never called.
(4) I read on SO that the session variable in AVCaptureSession *session = [[AVCaptureSession alloc] init]; being local to viewDidLoad might be why the delegate isn't called, so I made it an instance variable of the AVCamViewController class, but it still isn't called.
Here is the viewDidLoad method I'm testing (taken from AVCam); I added the AVCaptureVideoDataOutput at the end of the method:
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Create the AVCaptureSession
    session = [[AVCaptureSession alloc] init];
    [self setSession:session];

    // Setup the preview view
    [[self previewView] setSession:session];

    // Check for device authorization
    [self checkDeviceAuthorizationStatus];

    // In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time.
    // Why not do all of this on the main queue?
    // -[AVCaptureSession startRunning] is a blocking call which can take a long time. We dispatch session setup to the sessionQueue so that the main queue isn't blocked (which keeps the UI responsive).
    dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
    [self setSessionQueue:sessionQueue];

    dispatch_async(sessionQueue, ^{
        [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

        NSError *error = nil;
        AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
        AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (error)
        {
            NSLog(@"%@", error);
        }
        if ([session canAddInput:videoDeviceInput])
        {
            [session addInput:videoDeviceInput];
            [self setVideoDeviceInput:videoDeviceInput];
            dispatch_async(dispatch_get_main_queue(), ^{
                // Why are we dispatching this to the main queue?
                // Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on main thread.
                // Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer’s connection with other session manipulation.
                [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[self interfaceOrientation]];
            });
        }

        AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
        AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
        if (error)
        {
            NSLog(@"%@", error);
        }
        if ([session canAddInput:audioDeviceInput])
        {
            [session addInput:audioDeviceInput];
        }

        AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
        if ([session canAddOutput:movieFileOutput])
        {
            [session addOutput:movieFileOutput];
            AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
            if ([connection isVideoStabilizationSupported])
                [connection setEnablesVideoStabilizationWhenAvailable:YES];
            [self setMovieFileOutput:movieFileOutput];
        }

        AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        if ([session canAddOutput:stillImageOutput])
        {
            [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
            [session addOutput:stillImageOutput];
            [self setStillImageOutput:stillImageOutput];
        }

        AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
        [videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];
        if ([session canAddOutput:videoDataOutput])
        {
            NSLog(@"Yes I can add it");
            [session addOutput:videoDataOutput];
        }
    });
}
- (void)viewWillAppear:(BOOL)animated
{
    dispatch_async([self sessionQueue], ^{
        [self addObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
        [self addObserver:self forKeyPath:@"stillImageOutput.capturingStillImage" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:CapturingStillImageContext];
        [self addObserver:self forKeyPath:@"movieFileOutput.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];

        __weak AVCamViewController *weakSelf = self;
        [self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) {
            AVCamViewController *strongSelf = weakSelf;
            dispatch_async([strongSelf sessionQueue], ^{
                // Manually restarting the session since it must have been stopped due to an error.
                [[strongSelf session] startRunning];
                [[strongSelf recordButton] setTitle:NSLocalizedString(@"Record", @"Recording button record title") forState:UIControlStateNormal];
            });
        }]];

        [[self session] startRunning];
    });
}
Can someone tell me why this happens and suggest how to fix it?
Answer 0 (score: 6)
I've done a good deal of experimenting, and I think I probably have the answer. I have similar but different code, written from scratch rather than copied from Apple's sample (which is a bit old by now).
I think it's this section:
AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieFileOutput])
{
    [session addOutput:movieFileOutput];
    AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoStabilizationSupported])
        [connection setEnablesVideoStabilizationWhenAvailable:YES];
    [self setMovieFileOutput:movieFileOutput];
}
From my experimenting, this is what causes the problem. In my code, when this section is present, captureOutput:didOutputSampleBuffer:fromConnection is never called. I believe the video system EITHER supplies you with a series of sample buffers OR records a compressed, optimized movie file to disk, not both. (At least on iOS.) I suppose this makes sense / isn't surprising, but I haven't seen it documented anywhere!
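In other words (a minimal sketch of what I mean, not code from AVCam itself, reusing the session and queue names from the question's viewDidLoad): configure the session with only the data output and leave the movie file output out entirely:

AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
[videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];
if ([session canAddOutput:videoDataOutput])
{
    // Deliberately no AVCaptureMovieFileOutput on this session; in my
    // testing its presence is what stops didOutputSampleBuffer from firing.
    [session addOutput:videoDataOutput];
}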
Also, at one point I seemed to be getting errors and/or missing buffer callbacks when I had the microphone turned on. Again undocumented; the errors were -11800 (unknown error). But I couldn't always reproduce that.
Answer 1 (score: 1)
Your code looks fine to me, and I can think of 10 guess-and-check things you could try, so I'll take a different approach and hopefully solve the problem indirectly. Apart from the fact that I think AVCam is poorly written, you're better off looking at an example that is concerned only with live video, rather than also recording video and taking still images. I've provided an example that does exactly that and no more.
-(void)startSession {
    self.session = [AVCaptureSession new];
    self.session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find the back camera.
    AVCaptureDevice *backCamera;
    for (AVCaptureDevice *device in [AVCaptureDevice devices]) {
        if ([device hasMediaType:AVMediaTypeVideo] && device.position == AVCaptureDevicePositionBack) {
            backCamera = device;
            break;
        }
    }

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
    if (error) {
        // handle error
    }
    if ([self.session canAddInput:input]) {
        [self.session addInput:input];
    }

    AVCaptureVideoDataOutput *output = [AVCaptureVideoDataOutput new];
    // self.queue is assumed to be a serial dispatch queue created elsewhere.
    [output setSampleBufferDelegate:self queue:self.queue];
    output.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey:@(kCVPixelFormatType_32BGRA)};
    if ([self.session canAddOutput:output]) {
        [self.session addOutput:output];
    }

    dispatch_async(self.queue, ^{
        [self.session startRunning];
    });
}
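For completeness, here is a minimal sketch (my addition, an assumption about the surrounding class rather than part of the example above) of the queue setup and delegate method the code above expects; the callback then arrives on self.queue once per frame:

// Somewhere before startSession, e.g. in init:
// self.queue = dispatch_queue_create("video data queue", DISPATCH_QUEUE_SERIAL);

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Each frame arrives as a CMSampleBuffer wrapping a BGRA pixel buffer
    // (per the videoSettings above).
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (imageBuffer) {
        CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
        NSLog(@"Got a %zux%zu frame", width, height);
        CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    }
}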
Answer 2 (score: 0)
I ran into the same problem while working on a bridge between React Native and native iOS/Swift/Objective-C.
Then I found 2 similar questions. @Carl's answer above does indeed seem to be correct. Then I found the answer to the other question:

I contacted an engineer at Apple support, and he told me that simultaneous use of AVCaptureVideoDataOutput + AVCaptureMovieFileOutput is not supported. I don't know whether they will support it in the future, but the words he used were "not supported at this time".

I encourage you to file a bug report / feature request (bugreport.apple.com) as I did, since they measure how much people want a feature, and we may perhaps see this supported in the near future.
Simultaneous AVCaptureVideoDataOutput and AVCaptureMovieFileOutput
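If you need both live frames and a recorded movie, a common workaround (my sketch, not something the Apple engineer or the answers above spell out) is to drop AVCaptureMovieFileOutput and write the movie yourself with an AVAssetWriter fed from the same sample buffers; the file URL and dimensions below are hypothetical:

NSURL *outputURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"capture.mov"]]; // hypothetical destination
NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL fileType:AVFileTypeQuickTimeMovie error:&error];
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                     outputSettings:@{AVVideoCodecKey : AVVideoCodecH264,
                                                                                      AVVideoWidthKey : @1280,
                                                                                      AVVideoHeightKey : @720}];
writerInput.expectsMediaDataInRealTime = YES; // keep up with the live feed
[writer addInput:writerInput];
// Then, inside captureOutput:didOutputSampleBuffer:fromConnection:
//   - for the first buffer only, call [writer startWriting] and
//     [writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
//   - afterwards, append each buffer while the input can accept it:
//     if (writerInput.isReadyForMoreMediaData) [writerInput appendSampleBuffer:sampleBuffer];
// This keeps the didOutputSampleBuffer callbacks flowing while still
// producing a movie file.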