What is the correct way to manage the AudioKit lifecycle?

Asked: 2017-05-30 09:52:02

Tags: ios swift core-audio audiokit

I'm building an app that has to track the amplitude of the user's microphone input. AudioKit has a number of convenient objects for my needs, such as AKAmplitudeTracker. But I haven't found any workable information on how to start AudioKit, begin tracking, and so on.

Right now, all the code related to AudioKit initialization sits in the viewDidLoad method of the recorder module's root VC. This isn't right, because random errors occur and I can't track them down. The code below shows how I'm using AudioKit at the moment.

var silence: AKBooster!
var tracker: AKAmplitudeTracker!
var mic: AKMicrophone!

...

override func viewDidLoad() {
    super.viewDidLoad()

    switch AVAudioSession.sharedInstance().recordPermission() {

    case AVAudioSessionRecordPermission.granted:

        self.mic = AKMicrophone()
        self.tracker = AKAmplitudeTracker(self.mic)
        AKSettings.audioInputEnabled = true
        AudioKit.output = self.tracker
        AudioKit.start()
        self.mic.start()
        self.tracker.start()

        break

    case AVAudioSessionRecordPermission.undetermined:

        AVAudioSession.sharedInstance().requestRecordPermission { (granted) in

            if granted {
                self.mic = AKMicrophone()
                self.tracker = AKAmplitudeTracker(self.mic)
                AKSettings.audioInputEnabled = true
                AudioKit.output = self.tracker
                AudioKit.start()
                self.mic.start()
                self.tracker.start()
            }
        }

    case AVAudioSessionRecordPermission.denied:

        AVAudioSession.sharedInstance().requestRecordPermission { (granted) in

            if granted {
                self.mic = AKMicrophone()
                self.tracker = AKAmplitudeTracker(self.mic)
                AKSettings.audioInputEnabled = true
                AudioKit.output = self.tracker
                AudioKit.start()
                self.mic.start()
                self.tracker.start()
            }
        }

    default:
        print("")
    }

    ...

}

Please help me figure out how to manage AudioKit properly.

2 Answers:

Answer 0 (score: 2)

From what I can see, it looks like it should work, so something may be going on elsewhere in your code. I made a stripped-down demo to test the basics, and it works. I just added a timer to poll the amplitude.

import UIKit
import AudioKit

class ViewController: UIViewController {

    var mic: AKMicrophone!
    var tracker: AKAmplitudeTracker!

    override func viewDidLoad() {
        super.viewDidLoad()

        mic = AKMicrophone()
        tracker = AKAmplitudeTracker(mic)
        AudioKit.output = tracker
        AudioKit.start()

        Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { (timer) in
            print(self.tracker.amplitude)
        }
    }
}
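A note on the timer readout above: polling every 0.1 s gives jumpy values. If you want a steadier level display, a small smoothing helper works; this is a plain-Swift sketch of my own (not part of AudioKit), which you would feed with `tracker.amplitude` inside the timer callback:

```swift
import Foundation

// Simple exponential moving average to smooth polled amplitude values.
// `alpha` near 1.0 follows the input quickly; near 0.0 smooths heavily.
struct AmplitudeSmoother {
    private(set) var value: Double = 0.0
    let alpha: Double

    init(alpha: Double = 0.3) {
        self.alpha = alpha
    }

    // Feed in a new raw sample; returns the smoothed value.
    mutating func add(_ sample: Double) -> Double {
        value = alpha * sample + (1.0 - alpha) * value
        return value
    }
}

var smoother = AmplitudeSmoother(alpha: 0.5)
print(smoother.add(1.0))   // 0.5
print(smoother.add(1.0))   // 0.75
```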

Answer 1 (score: 1)

Alexey,

My recommendation for managing the AudioKit lifecycle is to put it in a singleton class. That's how it's set up in some of the AudioKit examples included in the repo, such as Analog Synth X and Drums. That way it isn't tied to a particular ViewController's viewDidLoad, and it can be accessed from multiple ViewControllers, or from the AppDelegate that manages the app's state. It also ensures that you only create one instance of it.

Here's an example where AudioKit is initialized in a class called Conductor (it could also be called AudioManager, etc.):

import AudioKit
import AudioKitUI

// Treat the conductor like a manager for the audio engine.
class Conductor {

    // Singleton of the Conductor class to avoid multiple instances of the audio engine
    static let sharedInstance = Conductor()

    // Create instance variables
    var mic: AKMicrophone!
    var tracker: AKAmplitudeTracker!

    // Add effects
    var delay: AKDelay!
    var reverb: AKCostelloReverb!

    // Balance between the delay and reverb mix.
    var reverbAmountMixer = AKDryWetMixer()

    init() {

        // Allow audio to play while the iOS device is muted.
        AKSettings.playbackWhileMuted = true

        AKSettings.defaultToSpeaker = true

        // Capture mic input
        mic = AKMicrophone()

        // Pull mic output into the tracker node.
        tracker = AKAmplitudeTracker(mic)

        // Pull the tracker output into the delay effect node.
        delay = AKDelay(tracker)
        delay.time = 2.0
        delay.feedback = 0.1
        delay.dryWetMix = 0.5

        // Pull the delay output into the reverb effect node.
        reverb = AKCostelloReverb(delay)
        reverb.presetShortTailCostelloReverb()

        // Mix the amount of reverb to the delay output node.
        reverbAmountMixer = AKDryWetMixer(delay, reverb, balance: 0.8)

        // Assign the reverbAmountMixer output to be the final audio output
        AudioKit.output = reverbAmountMixer

        // Start the AudioKit engine
        // This is in its own method so that the audio engine will start and stop via the AppDelegate's current state.
        startAudioEngine()

    }

    internal func startAudioEngine() {
        AudioKit.start()
        print("Audio engine started")
    }

    internal func stopAudioEngine() {
        AudioKit.stop()
        print("Audio engine stopped")
    }
}
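The startAudioEngine/stopAudioEngine methods above are meant to be driven by app state (e.g. backgrounding). The wiring itself doesn't need AudioKit to demonstrate, so here is a self-contained sketch where the engine calls are replaced by a flag; this is a stand-in for the real Conductor, purely to show the singleton pattern:

```swift
import Foundation

// Stand-in for the Conductor above: the AudioKit calls are replaced by an
// `isRunning` flag so the wiring pattern is runnable anywhere.
final class Conductor {
    static let sharedInstance = Conductor()
    private(set) var isRunning = false

    private init() {}   // private init enforces the singleton

    func startAudioEngine() { isRunning = true }
    func stopAudioEngine() { isRunning = false }
}

// In a real app, these calls would live in the AppDelegate, e.g.:
//   applicationDidEnterBackground → Conductor.sharedInstance.stopAudioEngine()
//   applicationWillEnterForeground → Conductor.sharedInstance.startAudioEngine()
Conductor.sharedInstance.startAudioEngine()
print(Conductor.sharedInstance.isRunning)   // true
Conductor.sharedInstance.stopAudioEngine()
print(Conductor.sharedInstance.isRunning)   // false
```

Because `sharedInstance` is a `static let`, Swift guarantees it is initialized lazily and exactly once, so every ViewController that touches it sees the same audio engine.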

Here's how you can access the amplitude-tracking data from the Conductor singleton class within a ViewController:

import UIKit

class ViewController: UIViewController {

    var conductor = Conductor.sharedInstance

    override func viewDidLoad() {
        super.viewDidLoad()

        Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true) { [unowned self] (timer) in
            print(self.conductor.tracker.amplitude)
        }

    }
}
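Since `tracker.amplitude` is a raw linear value, you may want decibels for a level meter. Here is a small plain-Swift helper of my own (not an AudioKit API; the -60 dB floor is an arbitrary display choice):

```swift
import Foundation

// Convert a linear amplitude (0.0...1.0) to decibels full scale,
// clamped to a floor so that silence doesn't produce -infinity.
func decibels(fromAmplitude amplitude: Double, floor: Double = -60.0) -> Double {
    guard amplitude > 0 else { return floor }
    return max(20.0 * log10(amplitude), floor)
}

print(decibels(fromAmplitude: 1.0))   // 0.0 (full scale)
print(decibels(fromAmplitude: 0.1))   // -20.0
print(decibels(fromAmplitude: 0.0))   // -60.0 (clamped)
```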

You can download the GitHub repo here:

https://github.com/markjeschke/AudioKit-Amplitude-Tracker

I hope this helps.

Take care,
Mark