captureOutput not called from AVCaptureAudioDataOutputSampleBufferDelegate

Asked: 2018-07-28 16:05:35

Tags: ios swift avfoundation

I have an app that records video, but I need it to show the user, in real time, the pitch of the sound being captured by the microphone. I have been able to record audio and video to an MP4 successfully using AVCaptureSession. However, when I add an AVCaptureAudioDataOutput to the session and assign the AVCaptureAudioDataOutputSampleBufferDelegate, I receive no errors, yet the captureOutput function is never called once the session starts.

Here is the code:

import UIKit
import AVFoundation
import CoreLocation


class ViewController: UIViewController, 
AVCaptureVideoDataOutputSampleBufferDelegate, 
AVCaptureFileOutputRecordingDelegate, CLLocationManagerDelegate , 
AVCaptureAudioDataOutputSampleBufferDelegate {

var videoFileOutput: AVCaptureMovieFileOutput!
let session = AVCaptureSession()
var outputURL: URL!
var timer:Timer!
var locationManager:CLLocationManager!
var currentMagnitudeValue:CGFloat!
var defaultMagnitudeValue:CGFloat!
var visualMagnitudeValue:CGFloat!
var soundLiveOutput: AVCaptureAudioDataOutput!
@IBOutlet weak var cameraView: UIView!   //Outlet referenced by the preview layer setup below


override func viewDidLoad() {
    super.viewDidLoad()
    self.setupAVCapture()
}


func setupAVCapture(){

    session.beginConfiguration()

    //Add the camera INPUT to the session
    let videoDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                              for: .video, position: .front)
    guard
        let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice!),
        session.canAddInput(videoDeviceInput)
        else { return }
    session.addInput(videoDeviceInput)

    //Add the microphone INPUT to the session
    let microphoneDevice = AVCaptureDevice.default(.builtInMicrophone, for: .audio, position: .unspecified)
    guard
        let audioDeviceInput = try? AVCaptureDeviceInput(device: microphoneDevice!),
        session.canAddInput(audioDeviceInput)
        else { return }
    session.addInput(audioDeviceInput)

    //Add the video file OUTPUT to the session
    videoFileOutput = AVCaptureMovieFileOutput()
    guard session.canAddOutput(videoFileOutput) else { return }
    session.addOutput(videoFileOutput)

    //Add the audio output so we can get PITCH of the sounds
    //AND assign the SampleBufferDelegate
    soundLiveOutput = AVCaptureAudioDataOutput()
    soundLiveOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "test"))
    if (session.canAddOutput(soundLiveOutput)) {
        session.addOutput(soundLiveOutput)
        print ("Live AudioDataOutput added")
    } else
    {
        print("Could not add AudioDataOutput")
    }



    //Preview Layer
    let previewLayer = AVCaptureVideoPreviewLayer(session: session)
    let rootLayer :CALayer = self.cameraView.layer
    rootLayer.masksToBounds=true
    previewLayer.frame = rootLayer.bounds
    rootLayer.addSublayer(previewLayer)
    previewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill;

    //Finalize the session
    session.commitConfiguration()

   //Begin the session
    session.startRunning()


}

func captureOutput(_: AVCaptureOutput, didOutput: CMSampleBuffer, from: AVCaptureConnection) {
    print("Bingo")
}

}

Expected output:

Bingo
Bingo
Bingo
...

What I have read:

StackOverflow: captureOutput not being called - the user had not declared the captureOutput method correctly.

StackOverflow: AVCaptureVideoDataOutput captureOutput not being called - the user had not declared a captureOutput method at all.

Apple - AVCaptureAudioDataOutputSampleBufferDelegate - Apple's documentation on the delegate and its method; the method there matches the one I have declared.

Other common mistakes I have come across online:

  • Using the method declaration from an older version of Swift (I am on v4.1)
  • A post claiming that after Swift 4.0, AVCaptureMetadataOutput replaced AVCaptureAudioDataOutput - I could not find this in Apple's documentation, but I tried it anyway, and similarly, the metadataOutput function was never called.

I am out of ideas. Am I missing something obvious?

3 Answers:

Answer 0 (score: 1):

The method you are using has been updated to the one below, and it is called for both AVCaptureAudioDataOutput and AVCaptureVideoDataOutput. Make sure you check which output produced the sample buffer before writing it to your asset writer.

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

    //Make sure you check the output before using the sample buffer
    //(audioDataOutput here is your AVCaptureAudioDataOutput - soundLiveOutput in the question)
    if output == audioDataOutput {
        //Use the sample buffer for audio
    }
}

Answer 1 (score: 0):

OK, nobody got back to me, but after playing around with it I worked out that the correct way to declare the captureOutput method for Swift 4 is as follows:

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    //Do your stuff here
}

Unfortunately, the documentation for this online is very poor. I guess you just have to get it exactly right - if you misspell or misname anything, no error is thrown, because it is an optional function.
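For the original goal of displaying pitch, the delegate callback only hands you a CMSampleBuffer; the frequency still has to be estimated from the samples. Below is a minimal sketch of a zero-crossing estimator over a mono [Float] buffer. estimateFrequency is a made-up helper, not part of AVFoundation, and it assumes you have already copied the buffer's PCM data into a Float array (e.g. via CMSampleBufferGetAudioBufferList):

```swift
import Foundation

// Crude fundamental-frequency estimate by counting zero crossings.
// Good enough for a live pitch readout on clean, mostly-monophonic input;
// a real implementation would use autocorrelation or an FFT instead.
func estimateFrequency(samples: [Float], sampleRate: Double) -> Double {
    guard samples.count > 1, sampleRate > 0 else { return 0 }
    var crossings = 0
    for i in 1..<samples.count where (samples[i - 1] < 0) != (samples[i] < 0) {
        crossings += 1
    }
    // A full cycle of a periodic wave crosses zero twice.
    let duration = Double(samples.count) / sampleRate
    return Double(crossings) / (2.0 * duration)
}
```

Feeding it a tenth of a second of a 440 Hz sine sampled at 44.1 kHz should come back within a few hertz of 440.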

Answer 2 (score: 0):

For me, the problem was that AVAudioSession and AVCaptureSession were declared as local variables, so when I started the session it simply went away. Once I moved them to class-level variables, everything worked great!
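The lifetime problem described here is ordinary Swift scoping rather than anything AVFoundation-specific. A minimal sketch with a stand-in class (FakeSession is invented for illustration, not a real AVCaptureSession) shows why a session held only in a local variable is deallocated as soon as the method returns:

```swift
// Stand-in for AVCaptureSession, purely to demonstrate object lifetime.
final class FakeSession {
    func startRunning() { /* pretend to start capturing */ }
}

final class LeakyController {
    weak var lastSession: FakeSession?   // weak, so it does not keep the session alive
    func start() {
        let session = FakeSession()      // local variable: released when start() returns
        session.startRunning()
        lastSession = session
    }
}

final class FixedController {
    let session = FakeSession()          // stored property: lives as long as the controller
    weak var lastSession: FakeSession?
    func start() {
        session.startRunning()
        lastSession = session
    }
}
```

After LeakyController's start() returns, the weak reference is already nil because nothing retained the session; in FixedController the stored property keeps it alive, which is why delegate callbacks keep arriving.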