How do I call CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer?

Date: 2014-12-12 12:46:57

Tags: audio swift initialization avfoundation sampling

I am trying to figure out how to call this AVFoundation function in Swift. I've spent a lot of time fiddling with declarations and syntax, and have gotten this far. The compiler is mostly happy, but I'm left with one final quandary.

public func captureOutput(
    captureOutput: AVCaptureOutput!,
    didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
    fromConnection connection: AVCaptureConnection!
) {
    let samplesInBuffer = CMSampleBufferGetNumSamples(sampleBuffer)
    var audioBufferList: AudioBufferList

    var buffer: Unmanaged<CMBlockBuffer>? = nil

    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        nil,
        &audioBufferList,
        UInt(sizeof(audioBufferList.dynamicType)),
        nil,
        nil,
        UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
        &buffer
    )

    // do stuff
}

The compiler complains about the third and fourth arguments:

Address of variable 'audioBufferList' taken before it is initialized

Variable 'audioBufferList' used before being initialized

So what do I do here?

I am working from this StackOverflow answer, but it's in Objective-C. I've been trying to translate it into Swift, but ran into this problem.

Or is there possibly a better way to do this? I need to read the data out of the buffer, one sample at a time, so I'm basically trying to get some sort of array of samples that I can iterate over.

6 Answers:

Answer 0 (score: 4)

Disclaimer: I have just tried to translate the code from Reading audio samples via AVAssetReader to Swift and verified that it compiles. I have not tested whether it actually works.

// Needs to be initialized somehow, even if we take only the address
var audioBufferList = AudioBufferList(mNumberBuffers: 1,
      mBuffers: AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil))

var buffer: Unmanaged<CMBlockBuffer>? = nil

CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
    sampleBuffer,
    nil,
    &audioBufferList,
    UInt(sizeof(audioBufferList.dynamicType)),
    nil,
    nil,
    UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
    &buffer
)

// Ensure that the buffer is released automatically.
let buf = buffer!.takeRetainedValue() 

// Create UnsafeBufferPointer from the variable length array starting at audioBufferList.mBuffers
let audioBuffers = UnsafeBufferPointer<AudioBuffer>(start: &audioBufferList.mBuffers,
    count: Int(audioBufferList.mNumberBuffers))

for audioBuffer in audioBuffers {
    // Create UnsafeBufferPointer<Int16> from the buffer data pointer
    var samples = UnsafeMutableBufferPointer<Int16>(start: UnsafeMutablePointer(audioBuffer.mData),
        count: Int(audioBuffer.mDataByteSize)/sizeof(Int16))

    for sample in samples {
        // ....
    }
}

Answer 1 (score: 3)

Swift 3 solution:

func loopAmplitudes(audioFileUrl: URL) {

    let asset = AVAsset(url: audioFileUrl)

    let reader = try! AVAssetReader(asset: asset)

    let track = asset.tracks(withMediaType: AVMediaTypeAudio)[0]

    let settings = [
        AVFormatIDKey : kAudioFormatLinearPCM
    ]

    let readerOutput = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(readerOutput)
    reader.startReading()

    while let buffer = readerOutput.copyNextSampleBuffer() {

        var audioBufferList = AudioBufferList(mNumberBuffers: 1, mBuffers: AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil))
        var blockBuffer: CMBlockBuffer?

        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            buffer,
            nil,
            &audioBufferList,
            MemoryLayout<AudioBufferList>.size,
            nil,
            nil,
            kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
            &blockBuffer
        );

        let buffers = UnsafeBufferPointer<AudioBuffer>(start: &audioBufferList.mBuffers, count: Int(audioBufferList.mNumberBuffers))

        for buffer in buffers {

            let samplesCount = Int(buffer.mDataByteSize) / MemoryLayout<Int16>.size
            let samplesPointer = buffer.mData!.bindMemory(to: Int16.self, capacity: samplesCount)
            let samples = UnsafeMutableBufferPointer<Int16>(start: samplesPointer, count: samplesCount)

            for sample in samples {

                // do something with your sample (which is an Int16 amplitude value)

            }
        }
    }
}

Answer 2 (score: 1)

Martin's answer works and is exactly what I asked about in the question; however, after posting the question and spending more time with the problem (and before seeing Martin's answer), I came up with this:

public func captureOutput(
    captureOutput: AVCaptureOutput!,
    didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
    fromConnection connection: AVCaptureConnection!
) {
    let samplesInBuffer = CMSampleBufferGetNumSamples(sampleBuffer)
    self.currentZ = Double(samplesInBuffer)

    let buffer: CMBlockBufferRef = CMSampleBufferGetDataBuffer(sampleBuffer)

    var lengthAtOffset: size_t = 0
    var totalLength: size_t = 0
    var data: UnsafeMutablePointer<Int8> = nil

    if( CMBlockBufferGetDataPointer( buffer, 0, &lengthAtOffset, &totalLength, &data ) != noErr ) {
        println("some sort of error happened")
    } else {
        for i in stride(from: 0, to: totalLength, by: 2) {
            // do stuff
        }
    }
}

This is a slightly different approach, and there is probably still room for improvement, but the main point here is that at least on an iPad Mini (and probably on other devices), each time this method is called we get 1,024 samples. But those samples come in an array of 2,048 Int8 values. Every other one is the left/right byte that needs to be combined into an Int16, turning the 2,048 half-samples into 1,024 whole samples.
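
For illustration, here is a minimal sketch (untested, and in later Swift than the answer above) of what that byte pairing could look like, assuming the stream is 16-bit little-endian PCM:

// Combine little-endian byte pairs from `data` (the UnsafeMutablePointer<Int8>
// filled in by CMBlockBufferGetDataPointer above) into Int16 samples.
// Assumes 16-bit little-endian PCM; untested.
for i in stride(from: 0, to: totalLength, by: 2) {
    let low  = UInt16(UInt8(bitPattern: data[i]))
    let high = UInt16(UInt8(bitPattern: data[i + 1]))
    let sample = Int16(bitPattern: (high << 8) | low)
    // ... use `sample` ...
}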

Answer 3 (score: 0)

It works for me. Try it:

let musicUrl: NSURL = mediaItemCollection.items[0].valueForProperty(MPMediaItemPropertyAssetURL) as! NSURL
let asset: AVURLAsset = AVURLAsset(URL: musicUrl, options: nil)
let assetOutput = AVAssetReaderTrackOutput(track: asset.tracks[0] as! AVAssetTrack, outputSettings: nil)

var error : NSError?

let assetReader: AVAssetReader = AVAssetReader(asset: asset, error: &error)

if error != nil {
    print("Error asset Reader: \(error?.localizedDescription)")
}

assetReader.addOutput(assetOutput)
assetReader.startReading()

let sampleBuffer: CMSampleBufferRef = assetOutput.copyNextSampleBuffer()

var audioBufferList = AudioBufferList(mNumberBuffers: 1, mBuffers: AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil))
var blockBuffer: Unmanaged<CMBlockBuffer>? = nil


CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
    sampleBuffer,
    nil,
    &audioBufferList,
    sizeof(audioBufferList.dynamicType), // instead of UInt(sizeof(audioBufferList.dynamicType))
    nil,
    nil,
    UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
    &blockBuffer
)

Answer 4 (score: 0)

The answers posted here make assumptions about the size of the necessary AudioBufferList, which may have allowed them to work in particular circumstances, but it didn't work for me when receiving audio from an AVCaptureSession. (Apple's own sample code didn't work either.)

The documentation on CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer isn't obvious about it, but it turns out you can ask the function how big the AudioBufferList needs to be, and then call it a second time with an AudioBufferList allocated to the required size.

Below is a C++ example (sorry, I don't know Swift) showing the more general solution that worked for me.

// ask the function how big the audio buffer list should be for this
// sample buffer ref
size_t requiredABLSize = 0;
err = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer,
                      &requiredABLSize,
                      NULL,
                      NULL,
                      kCFAllocatorSystemDefault,
                      kCFAllocatorSystemDefault,
                      kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
                      NULL);

// allocate an audio buffer list of the required size
AudioBufferList* audioBufferList = (AudioBufferList*) malloc(requiredABLSize);
// ensure that blockBuffer is NULL in case the function fails
CMBlockBufferRef blockBuffer = NULL;

// now let the function fill in the ABL for you
err = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer,
                      NULL,
                      audioBufferList,
                      requiredABLSize,
                      kCFAllocatorSystemDefault,
                      kCFAllocatorSystemDefault,
                      kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
                      &blockBuffer);

// if we succeeded...
if (err == noErr) {
   // la la la... read your samples...
}

// release the allocated block buffer
if (blockBuffer != NULL) {
    CFRelease(blockBuffer);
    blockBuffer = NULL;
}

// release the allocated ABL
if (audioBufferList != NULL) {
    free(audioBufferList);
    audioBufferList = NULL;
}

I'll leave it to the Swift experts to offer an implementation in that language.
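
For what it's worth, here's a rough, untested Swift sketch of the same two-call approach; the argument labels follow the Swift 4.2 call shown in the next answer, and the flag and allocators are carried over from the C++ version above:

import CoreMedia

// Untested sketch: call once for the required size, allocate, then call again.
func readAudioBufferList(from sampleBuffer: CMSampleBuffer) {
    // First call: ask only how big the AudioBufferList needs to be.
    var requiredSize = 0
    var status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        bufferListSizeNeededOut: &requiredSize,
        bufferListOut: nil,
        bufferListSize: 0,
        blockBufferAllocator: kCFAllocatorSystemDefault,
        blockBufferMemoryAllocator: kCFAllocatorSystemDefault,
        flags: UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
        blockBufferOut: nil)
    guard status == noErr else { return }

    // Allocate raw storage of exactly that size and view it as an AudioBufferList.
    let rawABL = UnsafeMutableRawPointer.allocate(
        byteCount: requiredSize,
        alignment: MemoryLayout<AudioBufferList>.alignment)
    defer { rawABL.deallocate() }
    let abl = rawABL.bindMemory(to: AudioBufferList.self, capacity: 1)

    // Second call: let the function fill in the list and retain the block buffer.
    var blockBuffer: CMBlockBuffer?
    status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        bufferListSizeNeededOut: nil,
        bufferListOut: abl,
        bufferListSize: requiredSize,
        blockBufferAllocator: kCFAllocatorSystemDefault,
        blockBufferMemoryAllocator: kCFAllocatorSystemDefault,
        flags: UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
        blockBufferOut: &blockBuffer)
    guard status == noErr else { return }

    // ... read your samples from `abl` ...
    // `blockBuffer` is an ARC-managed object in Swift; no manual CFRelease needed.
}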

Answer 5 (score: 0)

I do it like this (Swift 4.2):

let n = CMSampleBufferGetNumSamples(audioBuffer)
let format = CMSampleBufferGetFormatDescription(audioBuffer)!
let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(format)!.pointee

let nChannels = Int(asbd.mChannelsPerFrame) // probably 2
let bufferlistSize = AudioBufferList.sizeInBytes(maximumBuffers: nChannels)
let abl = AudioBufferList.allocate(maximumBuffers: nChannels)
for i in 0..<nChannels {
    abl[i] = AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil)
}

var block: CMBlockBuffer?
var status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(audioBuffer, bufferListSizeNeededOut: nil, bufferListOut: abl.unsafeMutablePointer, bufferListSize: bufferlistSize, blockBufferAllocator: nil, blockBufferMemoryAllocator: nil, flags: 0, blockBufferOut: &block)
assert(noErr == status)

// use AudioBufferList here (abl.unsafePointer), e.g. with ExtAudioFileWrite or what have you
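
If you want to walk the raw samples instead, `abl` here is an UnsafeMutableAudioBufferListPointer and can be iterated directly; a rough sketch, assuming the track is 16-bit linear PCM:

// Untested sketch: read Int16 samples out of each AudioBuffer in the list,
// assuming 16-bit linear PCM.
for audioBuffer in abl {
    guard let data = audioBuffer.mData else { continue }
    let count = Int(audioBuffer.mDataByteSize) / MemoryLayout<Int16>.size
    let samples = data.bindMemory(to: Int16.self, capacity: count)
    for i in 0..<count {
        _ = samples[i] // ... do something with this amplitude value ...
    }
}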