I have 4 AKPlayer nodes, each connected to some effects, and in the end they are all mixed together.
I want to render the output offline on iOS > 9.0 but I can't figure out how.
Edit: I have since implemented rendering with separate paths around iOS 11. While renderToFile on iOS >= 11 seems to behave well, on iOS < 11 the rendered file has some lag, jumps forward after a few seconds, and eventually ends up silent.
Here is my render function:
do {
    if #available(iOS 11, *) {
        // iOS 11+: render the existing graph offline with AudioKit.renderToFile
        let outputFile = try AKAudioFile(forWriting: url, settings: [:])
        _ = AudioKit.engine.isRunning
        try AudioKit.renderToFile(outputFile, duration: karaPlayer.duration, prerender: {
            self.seekTo(time: 0)
        })
    } else {
        // iOS 9/10: route the mix through AKOfflineRenderNode and render to a URL
        let offlineNode = AKOfflineRenderNode(self.mixer)
        AudioKit.output = offlineNode
        offlineNode.internalRenderEnabled = false
        try AudioKit.start()
        self.seekTo(time: 0)
        try offlineNode.renderToURL(url, duration: self.karaPlayer.duration)
        self.karaPlayer.stop()
        self.voicePlayer.stop()
        offlineNode.internalRenderEnabled = true
    }
} catch {
    print(error)
    print("Couldn't render output file")
}
The mixer's inputs are 2 AKPlayer nodes reading from ".caf" files.
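For reference, the graph is set up roughly like this (a minimal sketch assuming the AudioKit 4.x API; the file names are placeholders, and in the real project each player runs through its own effects chain before reaching the mixer):

import AudioKit

// Rough sketch of the graph described above; "kara.caf" and "voice.caf"
// stand in for the real files.
func setupGraph() throws -> (kara: AKPlayer, voice: AKPlayer, mixer: AKMixer) {
    let karaFile = try AKAudioFile(readFileName: "kara.caf")
    let voiceFile = try AKAudioFile(readFileName: "voice.caf")

    let karaPlayer = AKPlayer(audioFile: karaFile)
    let voicePlayer = AKPlayer(audioFile: voiceFile)

    // Effects would be inserted between each player and the mixer here.
    let mixer = AKMixer(karaPlayer, voicePlayer)
    AudioKit.output = mixer
    try AudioKit.start()
    return (karaPlayer, voicePlayer, mixer)
}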
Answer 0 (score: 3)
If you want to target iOS 11+, you can use AudioKit's renderToFile:
/// Render output to an AVAudioFile for a duration.
/// - Parameters
/// - audioFile: A file initialized for writing
/// - seconds: Duration to render
/// - prerender: A closure called before rendering starts, use this to start players, set initial parameters, etc...
///
@available(iOS 11.0, macOS 10.13, tvOS 11.0, *)
public func renderToFile(_ audioFile: AVAudioFile, seconds: Double, prerender: (() -> Void)? = nil) throws {
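A minimal usage sketch of that call (assuming AudioKit 4.x; exportMix, player, and outputURL are placeholder names, not part of the answer):

import AudioKit

// Sketch of the iOS 11+ path. `player` stands for whichever AKPlayer drives
// the mix that AudioKit.output is rendering.
@available(iOS 11, *)
func exportMix(player: AKPlayer, to outputURL: URL) throws {
    let outputFile = try AKAudioFile(forWriting: outputURL, settings: [:])

    // renderToFile runs the whole signal chain faster than real time; start
    // playback inside the prerender closure so the players are already
    // rolling when the offline render begins.
    try AudioKit.renderToFile(outputFile, seconds: player.duration) {
        player.play()
    }
}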
If you need to support iOS 9+, use AKOfflineRenderNode's render-to-URL function:
@available(iOS, deprecated: 11)
@available(tvOS, deprecated: 11)
@available(macOS, deprecated: 10.13)
open class AKOfflineRenderNode: AKNode, AKComponent, AKInput {
    public typealias AKAudioUnitType = AKOfflineRenderAudioUnit
    public static let ComponentDescription = AudioComponentDescription(effect: "mnrn")

    private var internalAU: AKAudioUnitType?

    open var internalRenderEnabled: Bool {
        get { return internalAU!.internalRenderEnabled }
        set { internalAU!.internalRenderEnabled = newValue }
    }

    open func renderToURL(_ url: URL, seconds: Double, settings: [String: Any]? = nil) throws {
        return try internalAU!.render(toFile: url, seconds: seconds, settings: settings)
    }
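And a hedged sketch of the pre-iOS 11 path using that node (again assuming AudioKit 4.x; exportMixLegacy, mixer, player, and outputURL are placeholder names for the asker's graph): disable internal rendering while you set up, start the engine and the players, then render.

import AudioKit

// Sketch of the iOS 9/10 path with AKOfflineRenderNode.
func exportMixLegacy(mixer: AKMixer, player: AKPlayer, to outputURL: URL) throws {
    let offlineNode = AKOfflineRenderNode(mixer)
    AudioKit.output = offlineNode

    // Keep the node from pushing audio to the device while setting up.
    offlineNode.internalRenderEnabled = false
    try AudioKit.start()
    player.play()

    // Pull audio through the graph offline and write it to the file.
    try offlineNode.renderToURL(outputURL, seconds: player.duration)

    player.stop()
    offlineNode.internalRenderEnabled = true
}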