AVAudioPlayerNode scheduled buffers and audio route changes in iOS 11

Date: 2017-11-30 07:18:21

Tags: ios11 avaudioplayernode

I've found a difference in behavior between iOS 9/10 and iOS 11 in what AVAudioPlayerNode does with buffers scheduled for future playback when an audio route change occurs (e.g. plugging in headphones). Has anyone experienced anything similar, and how did you resolve it? Note that I reported this issue on Apple's AVFoundation support forum almost two weeks ago and have received precisely zero responses.

The code that exhibits the issue is shown below. A brief explanation first: the code is a simple loop that repeatedly schedules buffers to play at some time in the future. The process is started by calling the 'runSequence' method, which schedules an audio buffer for future playback and sets the completion callback to the nested method 'audioCompleteHandler'. The completion callback calls 'runSequence' again, which schedules another buffer and keeps the process going forever. So a buffer is always scheduled, except while the completion handler is executing. The 'trace' calls in various places are an internal method that only prints when debugging, so they can be ignored.

In the audio route change notification handler (handleAudioRouteChange), when a new device becomes available (case .newDeviceAvailable), the code restarts the engine and the player, reactivates the audio session, and calls 'runSequence' to bring the loop back to life.

This all works fine on iOS 9.3.5 (iPhone 5C) and iOS 10.3.3 (iPhone 6), but fails on iOS 11.1.1 (iPad Air). The nature of the failure is that AVAudioPlayerNode does not play the audio but instead calls the completion handler immediately, which leads to a runaway condition. If I remove the line that restarts the loop (as marked in the code), it works fine on iOS 11.1.1 but fails on iOS 9.3.5 and iOS 10.3.3. That failure is different: the audio simply stops, and in the debugger I can see that the loop is no longer looping.

So a possible explanation is that under iOS 9.x and iOS 10.x, future-scheduled buffers become unscheduled when an audio route change occurs, whereas under iOS 11.x they do not.

This leads to two questions:

1. Has anyone seen behavior similar to this, and what was the resolution?
2. Can anyone point me to documentation describing the exact state of the engine, player, and session when an audio route change (or an audio interruption) occurs?
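
For context, the route-change handler further below is assumed to be registered along these lines; the registration isn't shown in the original code, so treat this as a sketch:

// Assumed setup (not in the original post): register for route-change
// notifications so handleAudioRouteChange(_:) is called on plug/unplug events.
NotificationCenter.default.addObserver(self,
                                       selector: #selector(handleAudioRouteChange(_:)),
                                       name: .AVAudioSessionRouteChange,
                                       object: nil)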

private func runSequence() {

    // For test only
    var timeBaseInfo = mach_timebase_info_data_t()
    mach_timebase_info(&timeBaseInfo)
    // End for test only

    let audioCompleteHandler = { [unowned self] in
        DispatchQueue.main.async {
            trace(level: .skim, items: "Player: \(self.player1.isPlaying), Engine: \(self.engine.isRunning)")
            self.player1.stop()
            switch self.runStatus {
            case .Run:
                self.runSequence()
            case .Restart:
                self.runStatus = .Run
                self.tickSeq.resetSequence()
                //self.updateRenderHostTime()
                self.runSequence()
            case .Halt:
                self.stopEngine()
                self.player1.stop()
                self.activateAudioSession(activate: false)
            }
        }
    }

    // Schedule buffer...
    if self.engine.isRunning {
        if let thisElem: (buffer: AVAudioPCMBuffer, duration: Int) = tickSeq.next() {
            self.player1.scheduleBuffer(thisElem.buffer, at: nil, options: [], completionHandler: audioCompleteHandler)
            self.player1.prepare(withFrameCount: thisElem.buffer.frameLength)
            self.player1.play(at: AVAudioTime(hostTime: self.startHostTime))
            self.startHostTime += AVAudioTime.hostTime(forSeconds: TimeInterval(Double(60.0 / Double(self.model.bpm.value)) * Double(thisElem.duration)))
            trace(level: .skim, items:
                "Samples: \(thisElem.buffer.frameLength)",
                "Time: \(mach_absolute_time() * (UInt64(timeBaseInfo.numer) / UInt64(timeBaseInfo.denom))) ",
                "Sample Time: \(player1.lastRenderTime!.hostTime)",
                "Play At: \(self.startHostTime) ",
                "Player: \(self.player1.isPlaying)",
                "Engine: \(self.engine.isRunning)")
        }
        else {
            // Sequence exhausted; nothing left to schedule
        }
    }
}
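
For completeness, a minimal sketch of how the loop might be kicked off; the question only says the process begins by calling runSequence, so the helper name startSequence and the 0.1 s lead time below are assumptions:

// Hypothetical kickoff (assumed, not from the original post).
func startSequence() {
    runStatus = .Run
    activateAudioSession(activate: true)
    startEngine()
    // Seed the first play time slightly in the future so the first buffer
    // can be scheduled before its start time arrives.
    startHostTime = mach_absolute_time() + AVAudioTime.hostTime(forSeconds: 0.1)
    runSequence()
}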


@objc func handleAudioRouteChange(_ notification: Notification) {

    trace(level: .skim, items: "Route change: Player: \(self.player1.isPlaying) Engine: \(self.engine.isRunning)")
    guard let userInfo = notification.userInfo,
        let reasonValue = userInfo[AVAudioSessionRouteChangeReasonKey] as? UInt,
        let reason = AVAudioSessionRouteChangeReason(rawValue:reasonValue) else { return }

    trace(level: .skim, items: audioSession.currentRoute, audioSession.mode)
    trace(level: .none, items: "Reason Value: \(String(describing: userInfo[AVAudioSessionRouteChangeReasonKey] as? UInt)); Reason: \(String(describing: AVAudioSessionRouteChangeReason(rawValue:reasonValue)))")

    switch reason {
    case .newDeviceAvailable:
        trace(level: .skim, items: "In handleAudioRouteChange.newDeviceAvailable")
        for output in audioSession.currentRoute.outputs where output.portType == AVAudioSessionPortHeadphones {
            startEngine()
            player1.play()
            activateAudioSession(activate: true)
            //updateRenderHostTime()
            runSequence() // <<--- Problem: works for iOS9,10; fails on iOS11. Remove it and iOS9,10 fail, works on iOS11
        }
    case .oldDeviceUnavailable:
        trace(level: .skim, items: "In handleAudioRouteChange.oldDeviceUnavailable")
        if let previousRoute =
            userInfo[AVAudioSessionRouteChangePreviousRouteKey] as? AVAudioSessionRouteDescription {
            for output in previousRoute.outputs where output.portType == AVAudioSessionPortHeadphones {
                player1.stop()
                stopEngine()
                tickSeq.resetSequence()
                DispatchQueue.main.async {
                    if let pp = self.playPause as UIButton? { pp.isSelected = false }
                }
            }
        }
    default: () // other route-change reasons not handled here
    }
}
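
The helpers startEngine/stopEngine/activateAudioSession referenced above aren't shown in the post; a minimal sketch of what they presumably look like, based on standard AVAudioEngine/AVAudioSession usage (the error handling is an assumption):

// Hypothetical helpers (assumed): standard engine/session start-stop plumbing.
func startEngine() {
    guard !engine.isRunning else { return }
    do { try engine.start() }
    catch { trace(level: .skim, items: "Engine start failed: \(error)") }
}

func stopEngine() {
    if engine.isRunning { engine.stop() }
}

func activateAudioSession(activate: Bool) {
    do { try audioSession.setActive(activate) }
    catch { trace(level: .skim, items: "Session (de)activation failed: \(error)") }
}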

1 Answer:

Answer 0 (score: 3):

So, problem solved through further digging/testing:

  • When AVAudioSession posts its route-change notification, the behavior differs between iOS 9/10 and iOS 11. In the notification handler, the engine is reported as not running (engine.isRunning == false) about 90% of the time on iOS 9/10, whereas on iOS 11 the engine is always reported as running (engine.isRunning == true)
  • For the roughly 10% of the time on iOS 9/10 where the engine is reported as running (engine.isRunning == true), it actually isn't: the engine is not running regardless of what engine.isRunning says
  • Because the engine has been stopped under iOS 9/10, the previously prepared audio has been released, and simply restarting the engine will not restart the audio; you have to re-schedule the file or buffer at the sample position where the engine stopped. Sadly, you can't obtain the current sample time while the engine is stopped (the player returns nil), so you have to:

    • Start the engine
    • Grab the sample time and accumulate it (+=) into a persistent property
    • Stop the player
    • Re-schedule the audio starting from the grabbed sample time (and prepare it)
    • Start the player
  • The engine state on iOS 9/10 is the same for the headphones-inserted case (.newDeviceAvailable) and the headphones-removed case (.oldDeviceUnavailable), so you need to do something similar for the removed case (the sample time likewise needs to be accumulated so you can restart the audio from where it stopped, because player.stop() resets the sample time to 0)

  • None of this is needed on iOS 11, but the code below works on iOS 9/10 and iOS 11, so it's probably safest to handle all versions the same way

The code below works on my test devices running iOS 9.3.5 (iPhone 5C), iOS 10.3.3 (iPhone 6), and iOS 11.1.1 (iPad Air). (I remain puzzled, though, that I could find no prior commentary on how to handle route changes correctly; hundreds of people must have run into this?? Usually when I can't find any commentary on a topic I assume I'm doing something wrong or just not getting it... oh well...):

@objc func handleAudioRouteChange(_ notification: Notification) {

    guard let userInfo = notification.userInfo,
        let reasonValue = userInfo[AVAudioSessionRouteChangeReasonKey] as? UInt,
        let reason = AVAudioSessionRouteChangeReason(rawValue:reasonValue) else { return }

    switch reason {
    case .newDeviceAvailable:

        for output in audioSession.currentRoute.outputs where output.portType == AVAudioSessionPortHeadphones {
            headphonesConnected = true
        }

        startEngine()   // Do this regardless of whether engine.isRunning == true

        if let lrt = player.lastRenderTime, let st = player.playerTime(forNodeTime: lrt)?.sampleTime {
            playSampleOffset += st  // Accumulate so that multiple inserts/removals move the play point forward
            stopPlayer()
            scheduleSegment(file: playFile, at: nil, player: player, start: playSampleOffset, length: AVAudioFrameCount(playFile.length - playSampleOffset))
            startPlayer()
        }
        else {
            // Unknown problem with getting sampleTime; reset engine, player(s), restart as appropriate
        }

    case .oldDeviceUnavailable:
        if let previousRoute =
            userInfo[AVAudioSessionRouteChangePreviousRouteKey] as? AVAudioSessionRouteDescription {
            for output in previousRoute.outputs where output.portType == AVAudioSessionPortHeadphones {
                headphonesConnected = false
            }
        }

        startEngine()   // Do this regardless of whether engine.isRunning == true

        if let lrt = player.lastRenderTime, let st = player.playerTime(forNodeTime: lrt)?.sampleTime  {
            playSampleOffset += st  // Accumulate...
            stopPlayer()
            scheduleSegment(file: playFile, at: nil, player: player, start: playSampleOffset, length: AVAudioFrameCount(playFile.length - playSampleOffset))
            startPlayer()   // Test only, in reality don't restart here; set play control to allow user to start audio
        }
        else {
            // Unknown problem with getting sampleTime; reset engine, player(s), restart as appropriate
        }

...
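
The scheduleSegment(file:at:player:start:length:) helper used above isn't shown in the answer; presumably it is a thin wrapper over AVAudioPlayerNode's built-in scheduleSegment API, along these lines (a sketch, with the parameter order taken from the calls above):

// Hypothetical wrapper (assumed): schedule part of a file on a player node,
// starting at the given frame, then prepare it for playback.
func scheduleSegment(file: AVAudioFile, at when: AVAudioTime?, player: AVAudioPlayerNode,
                     start: AVAudioFramePosition, length: AVAudioFrameCount) {
    player.scheduleSegment(file, startingFrame: start, frameCount: length,
                           at: when, completionHandler: nil)
    player.prepare(withFrameCount: length)
}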