Clicks during inter-app audio recording

Date: 2016-07-20 08:13:34

Tags: ios objective-c audio avfoundation core-audio

I've been trying to record my input during an inter-app audio session on iOS 9. The speaker output sounds fine, but the recorded file has rhythmic clicks in it. The waveform looks like this...

[image: recorded waveform showing periodic clicks]

I've tweaked every setting and parameter I can think of, and nothing seems to help.

Here are the format settings (the stream settings are the same)...

    AudioStreamBasicDescription fileFormat;
    fileFormat.mSampleRate       = kSessionSampleRate;
    fileFormat.mFormatID         = kAudioFormatLinearPCM;
    fileFormat.mFormatFlags      = kAudioFormatFlagsNativeFloatPacked;
    fileFormat.mFramesPerPacket  = 1;
    fileFormat.mChannelsPerFrame = 1;
    fileFormat.mBitsPerChannel   = 32;       // tone is correct but there are still pops
    fileFormat.mBytesPerPacket   = sizeof(Float32);
    fileFormat.mBytesPerFrame    = sizeof(Float32);

Here is the stream setup...

    // connect instrument to output
    AudioComponentDescription componentDescription = unit.componentDescription;
    AudioComponent inputComponent = AudioComponentFindNext(NULL, &componentDescription);
    OSStatus status = AudioComponentInstanceNew(inputComponent, &_instrumentUnit);
    NSLog(@"%d",status);
    AudioUnitElement instrumentOutputBus = 0;
    AudioUnitElement ioUnitInputElement = 0;

    // connect instrument unit to remoteIO output's input bus
    AudioUnitConnection connection;
    connection.sourceAudioUnit = _instrumentUnit;
    connection.sourceOutputNumber = instrumentOutputBus;
    connection.destInputNumber = ioUnitInputElement;
    status = AudioUnitSetProperty(_ioUnit,
                                  kAudioUnitProperty_MakeConnection,
                                  kAudioUnitScope_Output,
                                  ioUnitInputElement,
                                  &connection,
                                  sizeof(connection));
    NSLog(@"%d",status);
    UInt32 maxFrames = 1024; // I tried setting this to 4096 but it did not help
    status = AudioUnitSetProperty(_instrumentUnit,
                                  kAudioUnitProperty_MaximumFramesPerSlice,
                                  kAudioUnitScope_Output,
                                  0,
                                  &maxFrames,
                                  sizeof(maxFrames));
    NSLog(@"%d",status);

    _connectedInstrument = YES;
    _instrumentIconImageView.image = unit.icon;
    NSLog(@"Remote Instrument connected");
    status = AudioUnitInitialize(_ioUnit);
    NSLog(@"%d",status);
    status = AudioOutputUnitStart(_ioUnit);
    NSLog(@"%d",status);
    status = AudioUnitInitialize(_instrumentUnit);
    NSLog(@"%d",status);
    [self setupFile];

Here is my callback...

    static OSStatus recordingCallback(void                       *inRefCon,
                                      AudioUnitRenderActionFlags *ioActionFlags,
                                      const AudioTimeStamp       *inTimeStamp,
                                      UInt32                     inBusNumber,
                                      UInt32                     inNumberFrames,
                                      AudioBufferList            *ioData)
    {
        ViewController *This = (__bridge ViewController *)inRefCon;
        if (inBusNumber == 0 && !(*ioActionFlags & kAudioUnitRenderAction_PostRenderError))
        {
            ExtAudioFileWriteAsync(This->fileRef, inNumberFrames, ioData);
        }

        return noErr;
    }

Full view controller code here

Thanks for your help.

2 Answers:

Answer 0 (score: 5)

You are writing to the file on both pre-render and post-render. In your render callback, change the if statement so that it writes only on post-render.

if (inBusNumber == 0 && *ioActionFlags == kAudioUnitRenderAction_PostRender){
    ExtAudioFileWriteAsync(This->fileRef, inNumberFrames, ioData);
} 

ExtAudioFileWriteAsync does some internal copying and buffering, so it is safe to use inside a render callback as long as you prime it before the first write.

Answer 1 (score: 1)

You most likely need to check both:

  • the post-render action flag
  • post-render errors

The critical part of the callback could look somewhat like this:

if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
    static int TEMP_kAudioUnitRenderAction_PostRenderError = (1 << 8);
    if (!(*ioActionFlags & TEMP_kAudioUnitRenderAction_PostRenderError))
    {
        ExtAudioFileWriteAsync(This->fileRef, inNumberFrames, ioData);
        // whichever additional code is needed
        // { … }
    }
}