Writing and encoding Remote I/O output to a file - Core Audio

Date: 2012-05-07 11:08:31

Tags: iphone core-audio

I'm coding a music recording app using Audio Units, and I'm having trouble getting my M4A file to play anything other than a not-so-awesome buzzing sound. I've used these SO posts as references, and I've tried everything to fix the problem.

I have an AUGraph with two nodes: a Multichannel Mixer and a Remote I/O unit. My mixer has two input callbacks: one that pulls input from the mic, and one that pulls from an audio file. The mixer output is connected to the input element of the output scope on the I/O unit. This allows simultaneous I/O.

To capture the output, I've added a callback and two methods:

The callback:

static OSStatus recordAndSaveCallback (void *                inRefCon,
                                       AudioUnitRenderActionFlags * ioActionFlags,
                                       const AudioTimeStamp *       inTimeStamp,
                                       UInt32                       inBusNumber,
                                       UInt32                       inNumberFrames,
                                       AudioBufferList *            ioData) 
{
    Mixer* THIS = (__bridge Mixer*)inRefCon;
    AudioBufferList bufferList;

    OSStatus status;
    status = AudioUnitRender(THIS.ioUnit,    
                             ioActionFlags,
                             inTimeStamp,
                             0,
                             inNumberFrames,
                             &bufferList);

    SInt16 samples[inNumberFrames]; // A large enough size to not have to worry about buffer overrun
    memset (&samples, 0, sizeof (samples));

    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mData = samples;
    bufferList.mBuffers[0].mNumberChannels = 1;
    bufferList.mBuffers[0].mDataByteSize = inNumberFrames*sizeof(SInt16);

    OSStatus result;
    if (*ioActionFlags == kAudioUnitRenderAction_PostRender) {
        result =  ExtAudioFileWriteAsync(THIS.extAudioFileRef, inNumberFrames, &bufferList);
        if(result) printf("ExtAudioFileWriteAsync %ld \n", result);}
    return noErr; 
}

The recording method:

- (void)recordFile
{    
    OSStatus result;

    NSArray  *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *recordFile = [documentsDirectory stringByAppendingPathComponent: @"audio.m4a"];

    CFURLRef destinationURL = CFURLCreateWithFileSystemPath(kCFAllocatorDefault, 
                                                            (__bridge   CFStringRef)recordFile, 
                                                            kCFURLPOSIXPathStyle, 
                                                            false);    

    AudioStreamBasicDescription destinationFormat;
    memset(&destinationFormat, 0, sizeof(destinationFormat));
    destinationFormat.mChannelsPerFrame = 1;
    destinationFormat.mFormatID = kAudioFormatMPEG4AAC;
    UInt32 size = sizeof(destinationFormat);
    result = AudioFormatGetProperty(kAudioFormatProperty_FormatInfo, 0, NULL, &size, &destinationFormat);        
    if(result) printf("AudioFormatGetProperty %ld \n", result);    


    result = ExtAudioFileCreateWithURL(destinationURL, 
                                       kAudioFileM4AType, 
                                       &destinationFormat, 
                                       NULL, 
                                       kAudioFileFlags_EraseFile, 
                                       &extAudioFileRef);
    if(result) printf("ExtAudioFileCreateWithURL %ld \n", result);


    AudioStreamBasicDescription clientFormat;
    memset(&clientFormat, 0, sizeof(clientFormat));


    UInt32 clientsize = sizeof(clientFormat);   
    result = AudioUnitGetProperty(ioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &clientFormat, &clientsize);
    if(result) printf("AudioUnitGetProperty %ld \n", result);

    UInt32 codec = kAppleHardwareAudioCodecManufacturer;

    result = ExtAudioFileSetProperty(extAudioFileRef, 
                                     kExtAudioFileProperty_CodecManufacturer, 
                                     sizeof(codec), 
                                     &codec);

    if(result) printf("ExtAudioFileSetProperty %ld \n", result);


    result = ExtAudioFileSetProperty(extAudioFileRef,kExtAudioFileProperty_ClientDataFormat,sizeof(clientFormat), &clientFormat);
    if(result) printf("ExtAudioFileSetProperty %ld \n", result);


    result =  ExtAudioFileWriteAsync(extAudioFileRef, 0, NULL);
    if (result) {[self printErrorMessage: @"ExtAudioFileWriteAsync error" withStatus: result];}

    result = AudioUnitAddRenderNotify(ioUnit, recordAndSaveCallback, (__bridge void*)self);
    if (result) {[self printErrorMessage: @"AudioUnitAddRenderNotify" withStatus: result];}     
}

The save method:

- (void) saveFile {
    OSStatus status = ExtAudioFileDispose(extAudioFileRef);
    NSLog(@"OSStatus(ExtAudioFileDispose): %ld\n", status);

}

Here is what I see in the console:

Stopping audio processing graph
OSStatus(ExtAudioFileDispose): 0
ExtAudioFileWriteAsync -50 
ExtAudioFileWriteAsync -50 
ExtAudioFileWriteAsync -50 

It seems to me that my code is very similar to that of people who have gotten this working, but clearly I'm making a crucial mistake somewhere. I'm sure someone else must have struggled with this.

Does anyone have any insight?

Thanks.

1 Answer:

Answer 0 (score: 1):

I know this question was asked a long time ago and you have probably found the error by now; I'm just answering for anyone else who might run into the same problem.

I may be wrong, but I think the problem comes from the fact that you are making an in-scope (stack) declaration for your buffer.

I suggest you change

SInt16 samples[inNumberFrames];

into

SInt16* samples = malloc(inNumberFrames * sizeof(SInt16));

Since recordAndSaveCallback is used to fill the buffer list, if you make an in-scope declaration, the data will be destroyed once the scope ends.