How to capture audio speech in iOS and get the captured audio frames

Date: 2020-10-16 07:50:29

Tags: objective-c swift avcapturesession avcapturedevice avcapturemoviefileoutput

I am trying to capture/record voice in iOS and get the audio frames at the same time. I tried the following code:

- (void)startRecord {
    NSError *error = nil;
    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
    if (!input) {
        // Handle the error appropriately.
    }
    [session addInput:input];
    AVCaptureAudioDataOutput *output = [[AVCaptureAudioDataOutput alloc] init];
    [session addOutput:output];
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    [session startRunning];
}


// Delegate routine that is called when a sample buffer is written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"sample buffer called");
    // Handle the sample buffer frame here
}

However, the delegate method is never called, so I never receive any of the buffered frames. I need to capture these voice frames and pass them on to another method.
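One likely reason the delegate never fires is that the AVCaptureSession is a local variable, so under ARC it is deallocated as soon as startRecord returns; microphone access (an NSMicrophoneUsageDescription entry in Info.plist plus a runtime permission request) is also required before any samples are delivered. Below is a minimal sketch illustrating that idea; the class name AudioFrameCapturer, the property names, and the queue label are illustrative assumptions, not part of the original question.

#import <AVFoundation/AVFoundation.h>

// Sketch only: keeps strong references to the session and output so they
// outlive startRecord, and requests microphone permission before starting.
@interface AudioFrameCapturer : NSObject <AVCaptureAudioDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *session;        // retained for the lifetime of the capture
@property (nonatomic, strong) AVCaptureAudioDataOutput *output;
@end

@implementation AudioFrameCapturer

- (void)startRecord {
    // Ask for microphone permission first; without it no sample buffers arrive.
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio
                             completionHandler:^(BOOL granted) {
        if (!granted) { return; }
        dispatch_async(dispatch_get_main_queue(), ^{
            [self setupSession];
        });
    }];
}

- (void)setupSession {
    NSError *error = nil;
    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
    if (input && [self.session canAddInput:input]) {
        [self.session addInput:input];
    }

    self.output = [[AVCaptureAudioDataOutput alloc] init];
    dispatch_queue_t queue = dispatch_queue_create("audio.capture.queue", DISPATCH_QUEUE_SERIAL);
    [self.output setSampleBufferDelegate:self queue:queue];
    if ([self.session canAddOutput:self.output]) {
        [self.session addOutput:self.output];
    }

    [self.session startRunning];
}

// Called on the capture queue for every audio sample buffer.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CMItemCount numSamples = CMSampleBufferGetNumSamples(sampleBuffer);
    NSLog(@"received %ld audio samples", (long)numSamples);
    // Forward sampleBuffer to other processing here.
}

@end

This is only a sketch of the suspected cause, not a confirmed answer; the key differences from the question's code are the retained session/output properties, the permission request, and the canAddInput:/canAddOutput: checks.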

0 Answers:

No answers yet.