AudioKit Synth One crashes when rendering offline from MIDI

Date: 2019-07-20 04:00:28

Tags: swift audiokit

I have a custom renderToFile function for offline rendering from MIDI, but when I try to render using a skinless version of Synth One (in the initial version of our app we are just using the presets), I get a crash in S1NoteState::startNoteHelper.

[UPDATE] Here is my full renderToFile function. It is basically the same as AudioKit's, but adds a check for MIDI events that occur while a buffer is being written. I also use an AKLazyTap to fill the buffer with samples from the instrument; that buffer is then written to the file the same way as in AK's original implementation.

public func renderToFile(_ audioFile: AVAudioFile, maximumFrameCount: AVAudioFrameCount, duration: Double, prerender: (() -> Void)? = nil) throws {
        guard duration >= 0 else {
            throw NSError(domain: "AVAudioEngine ext", code: 1,
                          userInfo: [NSLocalizedDescriptionKey: "Seconds needs to be a positive value"])
        }
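        // Shadow the parameter: duration is recomputed below from the last scheduled note-off.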
        var duration = duration
        let conductor = Conductor.sharedInstance

        try AKTry {
            // Engine can't be running when switching to offline render mode.
            if AudioKit.engine.isRunning { AudioKit.engine.stop() }
            try AudioKit.engine.enableManualRenderingMode(.offline, format: audioFile.processingFormat, maximumFrameCount: maximumFrameCount)

            // This resets the sampleTime of offline rendering to 0.
            AudioKit.engine.reset()
            try AudioKit.engine.start()
        }

        guard let buffer = AVAudioPCMBuffer(pcmFormat: AudioKit.engine.manualRenderingFormat, frameCapacity: AudioKit.engine.manualRenderingMaximumFrameCount) else {
            throw NSError(domain: "AVAudioEngine ext", code: 1,
                          userInfo: [NSLocalizedDescriptionKey: "Couldn't create buffer in renderToFile"])
        }
        // We'll try using a (local) tap to fill the buffer with samples from the Conductor's current instrument.
        let tap = AKLazyTap(node: conductor.instrumentBooster.avAudioNode)

        // This is for users to prepare the nodes for playing, i.e. player.play()
        prerender?()
        let scheduledEvents: [AKMIDIEvent] = conductor.eventsForOfflineRender
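        // Treat note messages (status 0x80-0x9F) with velocity 0 as note-offs and
        // only render up to the last one.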
        let noteOffs = scheduledEvents.filter { 128 ... 159 ~= $0.internalData[0] }.filter { $0.internalData[2] == 0 }
        let lastOff = noteOffs.sorted { $0.position!.seconds < $1.position!.seconds }.last!.position!.seconds
        duration = lastOff

        // Render until file contains >= target samples
        let targetSamples = AVAudioFramePosition(duration * AudioKit.engine.manualRenderingFormat.sampleRate)
        while audioFile.framePosition < targetSamples {
            // The range, in frames, to render for this window.
            let framesToRender = min(buffer.frameCapacity, AVAudioFrameCount(targetSamples - audioFile.framePosition))
            let windowSampleStart = audioFile.framePosition
            let windowSampleEnd = windowSampleStart + AVAudioFramePosition(framesToRender)

            var eventsInThisBuffer: [AKMIDIEvent] = []
            // Check whether the window represented by (inTimeStamp + inNumberFrames) contains any scheduledEvents. If so, pass those events to the instrument's music device (AudioUnit) during this callback.
            for event in scheduledEvents {
                if windowSampleStart ..< windowSampleEnd ~= Int64(event.position!.samples) {
                    eventsInThisBuffer.append(event)
                }
            }
            // Then pass the collected events to the conductor's current instrument.
            for event in eventsInThisBuffer {
                try conductor.handle(event: event)
            }

            var firstSampleTime = AudioTimeStamp()
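            // Have the lazy tap copy the next chunk of audio captured from the instrument
            // (and its timestamp) into the buffer before the engine renders this window.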
            tap?.fillNextBuffer(buffer, timeStamp: &firstSampleTime)

            let status = try AudioKit.engine.renderOffline(framesToRender, to: buffer)
            switch status {
            case .success:
                try audioFile.write(from: buffer)
            case .cannotDoInCurrentContext:
                AKLog("renderToFile cannotDoInCurrentContext")
                continue
            case .error, .insufficientDataFromInputNode:
                throw NSError(domain: "AVAudioEngine ext", code: 1,
                              userInfo: [NSLocalizedDescriptionKey: "renderToFile render error"])
            @unknown default:
                fatalError("Unknown render result")
            }
        }

        try AKTry {
            AudioKit.engine.stop()
            AudioKit.engine.disableManualRenderingMode()
            AudioKit.engine.reset()
            try AudioKit.engine.start()
        }
    }

We also have an AKMIDISampler instance that writes to file correctly with this method, with the one bug/exception that its output also contains a mysterious sine wave (help with that would be appreciated too!). I've seen other posts on SO about the sine wave, but the fixes mentioned there haven't worked in my case.

But as for the AK Synth One crash, I'm wondering whether there is something particular about manual rendering mode that is confusing things? I've tried enabling zombies, but that didn't provide anything useful. The backtrace (quite short) doesn't reveal anything significant that I can see:

* thread #14, queue = 'renderQueue', stop reason = EXC_BAD_ACCESS (code=1, address=0x0)
  * frame #0: 0x000000010265c910 Spliqs`S1NoteState::startNoteHelper(this=0x000000010b515730, noteNumber=48, velocity=59, frequency=130.812775) at S1NoteState.mm:105
    frame #1: 0x000000010266e34c Spliqs`S1DSPKernel::turnOnKey(this=0x00000001090e4f90, noteNumber=48, velocity=59, frequency=130.812775) at S1DSPKernel+toggleKeys.mm:85
    frame #2: 0x0000000102661778 Spliqs`S1DSPKernel::startNote(this=0x00000001090e4f90, noteNumber=48, velocity=59, frequency=130.812775) at S1DSPKernel+startStopNotes.mm:45
    frame #3: 0x0000000102658640 Spliqs`::-[S1AudioUnit startNote:velocity:frequency:](self=0x00000001090e4e00, _cmd="startNote:velocity:frequency:", note='0', velocity=';', frequency=130.812775) at S1AudioUnit.mm:98
    frame #4: 0x0000000102c1f500 Spliqs`AKSynthOne.play(noteNumber=48, velocity=59, frequency=130.81277465820313, self=0x0000000281aa40a0) at AKSynthOne.swift:223
    frame #5: 0x0000000102a8db7c Spliqs`Conductor.handleMIDI(data1=144, data2=48, data3=59, self=0x000000010811ccf0) at Conductor.swift:340
    frame #6: 0x0000000102a8d1ac Spliqs`Conductor.handle(event=AudioKit.AKMIDIEvent @ 0x000000016df0e830, self=0x000000010811ccf0) at Conductor.swift:322
    frame #7: 0x0000000102c5861c Spliqs`SQRenderManager.renderToFile(audioFile=0x0000000283be61a0, maximumFrameCount=512, duration=18, prerender=0x0000000102c57064 Spliqs`partial apply forwarder for closure #1 () -> () in Spliqs.SQRenderManager.offlineRender(_: SpliqsLib.Spliq, completion: () -> ()) -> () at <compiler-generated>, self=0x00000002812ba920) at SQRenderManager.swift:167
    frame #8: 0x0000000102c56be0 Spliqs`SQRenderManager.offlineRender(spliq=0x0000000112642a90, completion=0x0000000102c60b80 Spliqs`closure #1 () -> () in closure #1 () -> () in closure #1 (RxSwift.Event<Spliqs.TimedWatchableEvent>) -> () in Spliqs.SQRenderManager.init(withModule: Spliqs.SequencerModule) -> Spliqs.SQRenderManager at SQRenderManager.swift:527, self=0x00000002812ba920) at SQRenderManager.swift:101
    frame #9: 0x0000000102c60b68 Spliqs`closure #1 in closure #1 in SQRenderManager.init(self=0x00000002812ba920, spliq=0x0000000112642a90) at SQRenderManager.swift:527
    frame #10: 0x0000000102685cdc Spliqs`thunk for @escaping @callee_guaranteed () -> () at <compiler-generated>:0
    frame #11: 0x0000000107f0f260 libdispatch.dylib`_dispatch_call_block_and_release + 32
    frame #12: 0x0000000107f107e0 libdispatch.dylib`_dispatch_client_callout + 20
    frame #13: 0x0000000107f13cf8 libdispatch.dylib`_dispatch_continuation_pop + 560
    frame #14: 0x0000000107f130e0 libdispatch.dylib`_dispatch_async_redirect_invoke + 632
    frame #15: 0x0000000107f21a84 libdispatch.dylib`_dispatch_root_queue_drain + 352
    frame #16: 0x0000000107f2248c libdispatch.dylib`_dispatch_worker_thread2 + 144
    frame #17: 0x00000002291a3b50 libsystem_pthread.dylib`_pthread_wqthread + 468
    frame #18: 0x00000002291a9dc4 libsystem_pthread.dylib`start_wqthread + 4

Both the Synth One instance and our sampler are owned by a Conductor, following a design similar to the public release of Synth One, except that our Conductor handles MIDI events directly using the same approach as AKMIDINode (i.e. using enableMIDI / handleMIDI). Other than switching into and out of manual rendering mode, I'm really not sure what is going on.
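
For context, the path from handle(event:) into Synth One is roughly the following. This is a simplified sketch rather than our exact code: the synth property name, the default AKSynthOne initializer, and the velocity-zero note-off handling are illustrative, while handle(event:), handleMIDI(data1:data2:data3:), and AKSynthOne.play(noteNumber:velocity:frequency:) match the backtrace above.

import AudioKit

// A minimal sketch, assuming AudioKit 4.x and that AKSynthOne (from the Synth One
// sources included in the project) can be created with its default initializer.
class Conductor {
    static let sharedInstance = Conductor()

    // Assumption: the Synth One node is owned by the Conductor; the real property
    // name may differ.
    let synth = AKSynthOne()

    // Called for each scheduled event during offline rendering (and for live MIDI),
    // mirroring AKMIDINode's enableMIDI/handleMIDI pattern. Marked `throws` to match
    // the `try conductor.handle(event:)` call in renderToFile.
    func handle(event: AKMIDIEvent) throws {
        guard event.internalData.count >= 3 else { return }
        handleMIDI(data1: event.internalData[0],
                   data2: event.internalData[1],
                   data3: event.internalData[2])
    }

    func handleMIDI(data1: MIDIByte, data2: MIDIByte, data3: MIDIByte) {
        let status = data1 & 0xF0
        let note = MIDINoteNumber(data2)
        let velocity = MIDIVelocity(data3)
        if status == 0x90, velocity > 0 {
            // Note on: this is the call chain that ends in S1AudioUnit startNote ->
            // S1DSPKernel::startNote -> S1NoteState::startNoteHelper in the backtrace.
            synth.play(noteNumber: note,
                       velocity: velocity,
                       frequency: note.midiNoteToFrequency())
        } else if status == 0x80 || (status == 0x90 && velocity == 0) {
            // Note off (including note-on with velocity 0).
            synth.stop(noteNumber: note)
        }
    }
}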

I will mention that I do find AK's internals a bit mysterious, since we still run into issues with, for example, starting and stopping the engine, which surface these same problems (the mysterious sine wave, and the crash when using AK Synth One).

Any ideas appreciated.

0 Answers:

No answers