This post is also published on The Amazing Audio Engine forum.
Hi everyone. I'm new to The Amazing Audio Engine and to iOS dev in general, and I've been trying to figure out how to get the BPM of a track.
So far I have found two posts about offline rendering on the forum:
As far as I understand, the AEAudioControllerRenderMainOutput function is only correctly implemented in this fork.
I'm trying to do offline rendering to process the track and then use the algorithm described here (in JavaScript) and implemented here.
So far I'm loading this fork, and I'm using Swift (I'm part of the Make School Summer Academy at the moment, which teaches Swift).
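To make it clearer what I'm ultimately trying to compute, here is my own rough sketch of the peak-interval idea as I understood it from that algorithm. The function name, the threshold and the quantisation step are all my own placeholders, not code taken from the links above; it just assumes I already have the rendered samples as a plain Float array.

func estimateBPM(samples: [Float], sampleRate: Double, threshold: Float) -> Double {
    // 1. Collect indices of samples that poke above the threshold, skipping
    //    ahead ~250 ms after each hit so a single beat is only counted once.
    var peaks = [Int]()
    let skip = Int(sampleRate / 4)
    var i = 0
    while i < samples.count {
        if samples[i] > threshold {
            peaks.append(i)
            i += skip
        } else {
            i += 1
        }
    }
    if peaks.count < 2 { return 0 }

    // 2. Histogram the gaps between neighbouring peaks (quantised to ~10 ms)
    //    and keep the most frequent gap.
    var histogram = [Int: Int]()
    let quantum = Int(sampleRate / 100)
    for j in 1..<peaks.count {
        let gap = ((peaks[j] - peaks[j - 1]) / quantum) * quantum
        if let count = histogram[gap] {
            histogram[gap] = count + 1
        } else {
            histogram[gap] = 1
        }
    }
    var bestGap = 0
    var bestCount = 0
    for (gap, count) in histogram {
        if count > bestCount && gap > 0 {
            bestCount = count
            bestGap = gap
        }
    }
    if bestGap == 0 { return 0 }

    // 3. Convert "samples per beat" into beats per minute.
    return 60.0 * sampleRate / Double(bestGap)
}

The offline rendering below is supposed to give me those samples.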
This code works for me when just playing the track (no offline rendering!):
// Set up the controller first, then create a player channel for the bundled track
audioController = AEAudioController(audioDescription: AEAudioController.nonInterleavedFloatStereoAudioDescription())
let file = NSBundle.mainBundle().URLForResource("track", withExtension: "m4a")
let channel: AnyObject! = AEAudioFilePlayer.audioFilePlayerWithURL(file, audioController: audioController, error: nil)
let receiver = AEBlockAudioReceiver { (source, time, frames, audioBufferList) -> Void in
let leftSamples = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData)
// Advance the buffer sizeof(float) * 512
let rightSamples = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData) + 512
println("leftSamples: \(leftSamples) rightSamples: \(rightSamples)")
}
audioController.addChannels([channel])
audioController.addOutputReceiver(receiver)
audioController.start()
Trying offline rendering
This is the code I'm trying to run while using this fork:
audioController = AEAudioController(audioDescription: AEAudioController.nonInterleaved16BitStereoAudioDescription())
let file = NSBundle.mainBundle().URLForResource("track", withExtension: "mp3")
let channel: AnyObject! = AEAudioFilePlayer.audioFilePlayerWithURL(file, audioController: audioController, error: nil)
audioController.addChannels([channel])
audioController.start(nil)
audioController.stop()
var t = AudioTimeStamp()
let bufferLength: UInt32 = 4096
var buffer = AEAllocateAndInitAudioBufferList(audioController.audioDescription, Int32(bufferLength))
AEAudioControllerRenderMainOutput(audioController, t, bufferLength, buffer)
var renderDuration: NSTimeInterval = channel.duration
var sampleRate: Float64 = audioController.audioDescription.mSampleRate
var lengthInFrames: UInt32 = UInt32(renderDuration * sampleRate)
var songBuffer: [Float64]
t.mFlags = UInt32(kAudioTimeStampSampleTimeValid)
var frequencyAnalyzer = FrequencyAnalyzer()
println("renderDuration \(renderDuration)")
var outIsOpen = Boolean()
AUGraphClose(audioController.audioGraph)
AUGraphIsOpen(audioController.audioGraph, &outIsOpen)
println("AUGraphIsOpen: \(outIsOpen)")
for (var i: UInt32 = 0; i < lengthInFrames; i += bufferLength) {
AEAudioControllerRenderMainOutput(audioController, t, bufferLength, buffer);
t.mSampleTime += Float64(bufferLength)
println(t.mSampleTime)
let leftSamples = UnsafeMutablePointer<Int16>(buffer[0].mBuffers.mData)
let rightSamples = UnsafeMutablePointer<Int16>(buffer[0].mBuffers.mData) + 512
println("leftSamples: \(leftSamples.memory) rightSamples: \(rightSamples.memory)")
}
AEFreeAudioBufferList(buffer)
AUGraphOpen(audioController.audioGraph)
audioController.start(nil)
audioController.stop()
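For completeness, this is roughly the accumulation step I was planning to add inside that for-loop once AEAudioControllerRenderMainOutput actually produces audio. It is only a sketch under my own assumptions (the 16-bit non-interleaved format, and songBuffer declared as an empty array, var songBuffer = [Float64](), instead of the uninitialised declaration above):

// Hypothetical body to append at the end of the render loop above
let channelData = UnsafeMutablePointer<Int16>(buffer[0].mBuffers.mData)
for frame in 0..<Int(bufferLength) {
    // Normalise each 16-bit sample to -1.0 ... 1.0 before analysis
    songBuffer.append(Float64(channelData[frame]) / Float64(Int16.max))
}
// Later, songBuffer (or a Float copy of it) would be handed to frequencyAnalyzer
// or to the BPM sketch above -- I haven't gotten that far yet.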
Offline rendering is not working for me. The second example gives me a whole mix of errors that I don't understand.
A very common one happens inside the channelAudioProducer function, on this line:
// Tell mixer/mixer's converter unit to render into audio
status = AudioUnitRender(group->converterUnit ? group->converterUnit : group->mixerAudioUnit, arg->ioActionFlags, &arg->originalTimeStamp, 0, *frames, audio);
It gives me EXC_BAD_ACCESS (code=EXC_I386_GPFLT). That one comes up very often, among other errors.
Sorry for being a total noob in this field, but there are things I really don't understand. Should I be using nonInterleaved16BitStereoAudioDescription or nonInterleavedFloatStereoAudioDescription? And how is mData supposed to be read in each case?
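To show where my confusion comes from, this is how I currently picture the two descriptions (and I may well be wrong, which is exactly my question):

// My current (possibly wrong) mental model:
// - Both descriptions are non-interleaved stereo, so the AudioBufferList
//   should contain two AudioBuffers, one per channel.
// - With nonInterleavedFloatStereoAudioDescription each mData would hold
//   32-bit Float samples; with nonInterleaved16BitStereoAudioDescription
//   each mData would hold Int16 samples.
// Under that model, reading the first left-channel sample in the receiver
// block above would look like this for the float format:
let left = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData)
println("first left sample: \(left.memory)")
// ...and the right channel would live in the second AudioBuffer, not 512
// samples further into the first one, which is what my code above assumes.

Is that picture right, or am I misunderstanding how mData is filled?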
I would really love some help, because I'm lost at this point. And please, when you answer, try to explain it as fully as you can; I'm new to this stuff.
Note: if you don't know Swift, it's fine to post code in Objective-C.