I'm trying to read frequency values from the CMSampleBuffer returned by captureOutput of an AVCaptureAudioDataOutputSampleBufferDelegate. The idea is to create an AVAudioPCMBuffer from it so that I can read its floatChannelData. But I'm not sure how to pass the buffer to it — how do I populate its data?
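For context, here is the kind of thing I'd eventually like to do with the buffer once I have it (a sketch only — the RMS computation is illustrative, and the `rms` helper is my own name, not an API):

```swift
import AVFoundation

// Hypothetical helper: given an AVAudioPCMBuffer, read its float samples
// via floatChannelData and compute the RMS level of channel 0.
func rms(of pcmBuffer: AVAudioPCMBuffer) -> Float {
    // floatChannelData is non-nil only for 32-bit float PCM buffers.
    guard let channelData = pcmBuffer.floatChannelData else { return 0 }
    let frames = Int(pcmBuffer.frameLength)
    guard frames > 0 else { return 0 }
    let samples = UnsafeBufferPointer(start: channelData[0], count: frames)
    let sumOfSquares = samples.reduce(Float(0)) { $0 + $1 * $1 }
    return (sumOfSquares / Float(frames)).squareRoot()
}
```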
Answer 0: (score: 0)
Something along these lines should help:
// Get the AudioStreamBasicDescription describing the sample buffer's format.
var asbd = CMSampleBufferGetFormatDescription(sampleBuffer)!.audioStreamBasicDescription!

// Have Core Media fill an AudioBufferList whose data is backed by a
// retained CMBlockBuffer (which keeps the audio memory alive).
var audioBufferList = AudioBufferList()
var blockBuffer: CMBlockBuffer?
CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
    sampleBuffer,
    bufferListSizeNeededOut: nil,
    bufferListOut: &audioBufferList,
    bufferListSize: MemoryLayout<AudioBufferList>.size,
    blockBufferAllocator: nil,
    blockBufferMemoryAllocator: nil,
    flags: 0,
    blockBufferOut: &blockBuffer
)

// Derive the frame count from the byte size (assumes 32-bit float samples).
let mBuffers = audioBufferList.mBuffers
let frameLength = AVAudioFrameCount(Int(mBuffers.mDataByteSize) / MemoryLayout<Float>.size)

// Create a PCM buffer in the same format and point it at the extracted audio data.
let pcmBuffer = AVAudioPCMBuffer(pcmFormat: AVAudioFormat(streamDescription: &asbd)!, frameCapacity: frameLength)!
pcmBuffer.frameLength = frameLength
pcmBuffer.mutableAudioBufferList.pointee.mBuffers = mBuffers
pcmBuffer.mutableAudioBufferList.pointee.mNumberBuffers = 1
This seems to create a valid AVAudioPCMBuffer by the end of the capture callback. For my use case, though, it had the wrong frame length, so some further buffering was needed.
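Put together, the conversion inside the delegate callback might look roughly like this (a sketch under the same assumptions as the snippet above; the `AudioTap` class name is mine, and the buffer should be consumed inside the callback while `blockBuffer` is still alive, since it owns the audio memory):

```swift
import AVFoundation

// Hypothetical delegate that converts each incoming CMSampleBuffer
// into an AVAudioPCMBuffer and reads its floatChannelData in place.
class AudioTap: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard var asbd = CMSampleBufferGetFormatDescription(sampleBuffer)?
            .audioStreamBasicDescription else { return }

        var audioBufferList = AudioBufferList()
        var blockBuffer: CMBlockBuffer?  // keeps the audio memory alive in this scope
        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            sampleBuffer,
            bufferListSizeNeededOut: nil,
            bufferListOut: &audioBufferList,
            bufferListSize: MemoryLayout<AudioBufferList>.size,
            blockBufferAllocator: nil,
            blockBufferMemoryAllocator: nil,
            flags: 0,
            blockBufferOut: &blockBuffer
        )

        let mBuffers = audioBufferList.mBuffers
        let frameLength = AVAudioFrameCount(Int(mBuffers.mDataByteSize) / MemoryLayout<Float>.size)
        guard let format = AVAudioFormat(streamDescription: &asbd),
              let pcmBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameLength)
        else { return }
        pcmBuffer.frameLength = frameLength
        pcmBuffer.mutableAudioBufferList.pointee.mBuffers = mBuffers
        pcmBuffer.mutableAudioBufferList.pointee.mNumberBuffers = 1

        // Use the buffer here, while blockBuffer still retains the data,
        // e.g. accumulate frames for an FFT before analyzing frequencies.
        if let samples = pcmBuffer.floatChannelData {
            _ = samples
        }
    }
}
```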