Delegates for AVCaptureVideoDataOutput and AVCaptureAudioDataOutput

Asked: 2012-12-31 14:06:59

Tags: objective-c ios xcode

I am trying to obtain a CMSampleBufferRef from both AVCaptureVideoDataOutput and AVCaptureAudioDataOutput.

AVCamRecorder.h

#import <AVFoundation/AVFoundation.h>

@interface AVCamRecorder : NSObject

@property (nonatomic, retain) AVCaptureVideoDataOutput *videoDataOutput;
@property (nonatomic, retain) AVCaptureAudioDataOutput *audioDataOutput;

@end

AVCamRecorder.m

#import "AVCamRecorder.h"
#import <AVFoundation/AVFoundation.h>

@interface AVCamRecorder (VideoDataOutputDelegate) <AVCaptureVideoDataOutputSampleBufferDelegate>
@end
@interface AVCamRecorder (AudioDataOutputDelegate) <AVCaptureAudioDataOutputSampleBufferDelegate>
@end


@implementation AVCamRecorder

- (id)initWithSession:(AVCaptureSession *)aSession
{
    self = [super init];
    if (self != nil) {

        // Audio data output
        AVCaptureAudioDataOutput *aAudioDataOutput = [[AVCaptureAudioDataOutput alloc] init];

        // Video data output
        AVCaptureVideoDataOutput *aMovieDataOutput = [[AVCaptureVideoDataOutput alloc] init];

        if ([aSession canAddOutput:aAudioDataOutput]) {
            [aSession addOutput:aAudioDataOutput];
        }
        if ([aSession canAddOutput:aMovieDataOutput]) {
            [aSession addOutput:aMovieDataOutput];
        }

        [aAudioDataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
        [aMovieDataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

        [self setAudioDataOutput:aAudioDataOutput];
        [self setVideoDataOutput:aMovieDataOutput];

        [self setSession:aSession];
    }
    return self;
}

@end

@implementation AVCamRecorder (VideoDataOutputDelegate)
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"VideoDataOutputDelegate = %@", captureOutput);
}    
@end

@implementation AVCamRecorder (AudioDataOutputDelegate)
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"AudioDataOutputDelegate = %@", captureOutput);
}
@end

Strangely, I got video data in "@implementation AVCamRecorder (AudioDataOutputDelegate)":

AudioDataOutputDelegate = <AVCaptureVideoDataOutput: 0x208a7df0>

I swapped the order of "@implementation AVCamRecorder (VideoDataOutputDelegate)" and "@implementation AVCamRecorder (AudioDataOutputDelegate)", and instead I got

VideoDataOutputDelegate = <AVCaptureVideoDataOutput: 0x208a7df0>

It seems I cannot set up two "captureOutput:didOutputSampleBuffer:fromConnection:" callbacks; the data only ever arrives at one of them.

Or have I made a mistake in how I set up "@implementation AVCamRecorder (VideoDataOutputDelegate)" and "@implementation AVCamRecorder (AudioDataOutputDelegate)"?

I probably don't need separate callbacks, but I would like to know what is going wrong.

Thanks in advance for your help.

1 Answer:

Answer 0 (score: 1)

You have defined two categories on the same class,

AVCamRecorder (VideoDataOutputDelegate)
AVCamRecorder (AudioDataOutputDelegate)

both declaring the same method:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection;

This causes undefined behavior. See "Avoid Category Method Name Clashes" in the "Programming with Objective-C" guide:

  If the name of a method declared in a category is the same as a method in
  the original class, or a method in another category on the same class (or
  even a superclass), the behavior is undefined as to which method
  implementation is used at runtime ...
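
The clash is easy to reproduce outside AVFoundation. Below is a minimal sketch (hypothetical `Demo` class, Foundation only, not from the question) in which two categories on the same class declare the same selector; which implementation the runtime picks is undefined:

```objc
#import <Foundation/Foundation.h>

@interface Demo : NSObject
@end
@implementation Demo
@end

// Two categories on the same class declaring the same selector.
@interface Demo (A)
- (void)report;
@end
@implementation Demo (A)
- (void)report { NSLog(@"A's implementation"); }
@end

@interface Demo (B)
- (void)report;
@end
@implementation Demo (B)
- (void)report { NSLog(@"B's implementation"); }  // same selector as in (A)
@end

// [[Demo new] report] may log either message, depending on load order --
// exactly the symptom observed with the two delegate categories above.
```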

So your setup cannot work as intended. You could instead:

  • define two separate classes, one acting as the audio delegate and one as the video delegate,
  • define one class category that acts as the combined audio + video delegate (and check in the callback which output invoked it), or
  • simply use AVCamRecorder itself as the audio + video delegate.
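
The last option fits the code in the question with minimal change: keep a single callback and branch on which output delivered the buffer. A rough sketch (untested, assuming the videoDataOutput/audioDataOutput properties declared in AVCamRecorder.h):

```objc
// One category, one callback: distinguish outputs by comparing the
// captureOutput argument against the stored output objects.
@implementation AVCamRecorder (DataOutputDelegate)

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (captureOutput == self.videoDataOutput) {
        // A video frame arrived
        NSLog(@"video buffer from %@", captureOutput);
    } else if (captureOutput == self.audioDataOutput) {
        // Audio samples arrived
        NSLog(@"audio buffer from %@", captureOutput);
    }
}

@end
```

Since both outputs deliver to the same delegate and queue, one method body handles both streams without any name clash.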