So I've cobbled together some routines for recording audio based on a few posts here. The posts I drew from are here and here, along with the sites they reference.
My setup: I have an existing AUGraph (several AUSamplers -> Mixer -> RemoteIO). The AUSamplers are connected to tracks in a MusicPlayer instance. Everything works fine, but I want to add recording.
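(For reference, here's roughly what that graph setup looks like. This is only a sketch with placeholder names, and it assumes a MultiChannelMixer for the mixer; it is not my exact code.)

// Sketch: AUSampler -> MultiChannelMixer -> RemoteIO, as described above.
// Each MusicTrack in the MusicSequence is pointed at a sampler node
// via MusicTrackSetDestNode(track, samplerNode).
AUGraph graph;
AUNode samplerNode, mixerNode, ioNode;
NewAUGraph(&graph);

AudioComponentDescription desc = {0};
desc.componentManufacturer = kAudioUnitManufacturer_Apple;

desc.componentType    = kAudioUnitType_MusicDevice;
desc.componentSubType = kAudioUnitSubType_Sampler;
AUGraphAddNode(graph, &desc, &samplerNode);

desc.componentType    = kAudioUnitType_Mixer;
desc.componentSubType = kAudioUnitSubType_MultiChannelMixer;
AUGraphAddNode(graph, &desc, &mixerNode);

desc.componentType    = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_RemoteIO;
AUGraphAddNode(graph, &desc, &ioNode);

AUGraphOpen(graph);
AUGraphConnectNodeInput(graph, samplerNode, 0, mixerNode, 0);  // sampler into mixer input 0
AUGraphConnectNodeInput(graph, mixerNode, 0, ioNode, 0);       // mixer into RemoteIO
AUGraphInitialize(graph);
AUGraphStart(graph);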
Recording works, but the resulting .caf is slowed down in pitch/tempo and the sound quality is poor. There must be something wrong with the format I'm specifying?
Could someone look this over and tell me where I'm setting the wrong format?
EDIT: Could this be a stereo/mono issue? I mean to be recording in mono.
I set the stream format on the RemoteIO instance like this:
AudioStreamBasicDescription audioFormat;
audioFormat.mSampleRate = 44100.00;
audioFormat.mFormatID = kAudioFormatLinearPCM;
audioFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioFormat.mFramesPerPacket = 1;
audioFormat.mChannelsPerFrame = 1;
audioFormat.mBitsPerChannel = 16;
audioFormat.mBytesPerPacket = 2;
audioFormat.mBytesPerFrame = 2;
// Apply format
result = AudioUnitSetProperty(ioUnit,
                              kAudioUnitProperty_StreamFormat,
                              kAudioUnitScope_Output,
                              kInputBus,
                              &audioFormat,
                              sizeof(audioFormat));
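(ioUnit above is the RemoteIO unit pulled out of the graph earlier. A minimal sketch of that lookup, assuming the graph has already been opened and ioNode added to it; processingGraph and ioNode are placeholders for however the graph and node are stored.)

AudioUnit ioUnit = NULL;
// Get the RemoteIO AudioUnit backing its node in the (already opened) AUGraph.
OSStatus err = AUGraphNodeInfo(processingGraph, ioNode, NULL, &ioUnit);
NSAssert(err == noErr, @"Couldn't get RemoteIO unit from graph");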
Then, from a button action, I create a file ref and attach a renderCallback to the RemoteIO instance:
- (void)startRecording
{
    OSStatus result;

    // Same 16-bit mono packed LPCM description used on the RemoteIO output stream.
    AudioStreamBasicDescription audioFormat;
    audioFormat.mSampleRate       = 44100.00;
    audioFormat.mFormatID         = kAudioFormatLinearPCM;
    audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    audioFormat.mFramesPerPacket  = 1;
    audioFormat.mChannelsPerFrame = 1;
    audioFormat.mBitsPerChannel   = 16;
    audioFormat.mBytesPerPacket   = 2;
    audioFormat.mBytesPerFrame    = 2;

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *destinationFilePath = [[NSString alloc] initWithFormat:@"%@/output.caf", documentsDirectory];

    CFURLRef destinationURL = CFURLCreateWithFileSystemPath(kCFAllocatorDefault,
                                                            (__bridge CFStringRef)destinationFilePath,
                                                            kCFURLPOSIXPathStyle,
                                                            false);

    result = ExtAudioFileCreateWithURL(destinationURL,
                                       kAudioFileWAVEType,
                                       &audioFormat,
                                       NULL,
                                       kAudioFileFlags_EraseFile,
                                       &extAudioFileRef);
    CFRelease(destinationURL);
    NSAssert(result == noErr, @"Couldn't create file for writing");

    // Tell ExtAudioFile what format the data handed to it will be in.
    result = ExtAudioFileSetProperty(extAudioFileRef,
                                     kExtAudioFileProperty_ClientDataFormat,
                                     sizeof(AudioStreamBasicDescription),
                                     &audioFormat);
    NSAssert(result == noErr, @"Couldn't set client data format on file");

    // Prime the async writer with an initial NULL call so later writes from the render callback are safe.
    result = ExtAudioFileWriteAsync(extAudioFileRef, 0, NULL);
    NSAssert(result == noErr, @"Couldn't initialize write buffers for audio file");

    printf("Adding render notify to remoteIO \n");
    result = AudioUnitAddRenderNotify(ioUnit, renderCallback, (__bridge void *)self);
    if (result) { [self printErrorMessage:@"AudioUnitAddRenderNotify" withStatus:result]; return; }
}
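And when recording stops there is a matching teardown (not shown above; sketched here with an illustrative method name): remove the render notify so the callback stops firing, then dispose of the ExtAudioFile to finalize it.

- (void)stopRecording
{
    // Stop the callback from touching the file before closing it.
    OSStatus result = AudioUnitRemoveRenderNotify(ioUnit, renderCallback, (__bridge void *)self);
    if (result) { [self printErrorMessage:@"AudioUnitRemoveRenderNotify" withStatus:result]; }

    // Closing the ExtAudioFile writes out the final header/length information.
    result = ExtAudioFileDispose(extAudioFileRef);
    if (result) { [self printErrorMessage:@"ExtAudioFileDispose" withStatus:result]; }
    extAudioFileRef = NULL;
}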
Finally, in my renderCallback, I write out the data during the post-render phase:
static OSStatus renderCallback(void                       *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp       *inTimeStamp,
                               UInt32                      inBusNumber,
                               UInt32                      inNumberFrames,
                               AudioBufferList            *ioData)
{
    OSStatus result;

    // The notify fires for both the pre- and post-render phases of each render
    // cycle; only the post-render pass has the rendered output in ioData.
    if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
        double timeInSeconds = inTimeStamp->mSampleTime / kSampleRate;
        printf("%fs inBusNumber: %lu inNumberFrames: %lu \n", timeInSeconds, inBusNumber, inNumberFrames);

        MusicPlayerController *THIS = (__bridge MusicPlayerController *)inRefCon;

        result = ExtAudioFileWriteAsync(THIS->extAudioFileRef, inNumberFrames, ioData);
        if (result) printf("ExtAudioFileWriteAsync %ld \n", result);
    }
    return noErr;
}
Answer (score 3):
OK - I found some code that sorts this out, although I don't fully understand why it works.
I had mBitsPerChannel set to 16 for both the RemoteIO output stream and the ExtFileRef, and the result was the slowed-down, scratchy audio. Setting the ExtFileRef's mBitsPerChannel to 32 and adding the kAudioFormatFlagsNativeEndian flag fixes the problem: the .caf audio is perfect (with the RemoteIO output stream settings left as they were).
However, setting the RemoteIO output stream to match my new settings also works, so I'm confused. Shouldn't it work as long as the AudioStreamBasicDescription settings are symmetrical between the RemoteIO instance and the ExtFileRef?
Anyway... the working settings are below.
size_t bytesPerSample = sizeof(AudioUnitSampleType);  // 4 bytes: AudioUnitSampleType is 8.24 fixed-point SInt32 on iOS

AudioStreamBasicDescription audioFormat;
audioFormat.mSampleRate       = graphSampleRate;
audioFormat.mFormatID         = kAudioFormatLinearPCM;
audioFormat.mFormatFlags      = kAudioFormatFlagsNativeEndian | kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioFormat.mBytesPerPacket   = bytesPerSample;
audioFormat.mBytesPerFrame    = bytesPerSample;
audioFormat.mFramesPerPacket  = 1;
audioFormat.mChannelsPerFrame = 1;
audioFormat.mBitsPerChannel   = 8 * bytesPerSample;
audioFormat.mReserved         = 0;
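One way to take the guesswork out of matching the two sides (just a suggestion, not part of my working code above): ask the RemoteIO unit for the stream format it is actually rendering on its output scope and hand that same ASBD to ExtAudioFile as the client data format, e.g.:

// Query the format the RemoteIO unit actually renders on its output scope
// (element 0 is assumed here), then reuse it as the file's client data format
// so the two descriptions can't drift apart.
AudioStreamBasicDescription unitFormat;
UInt32 size = sizeof(unitFormat);
OSStatus err = AudioUnitGetProperty(ioUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Output,
                                    0,
                                    &unitFormat,
                                    &size);
if (err == noErr) {
    ExtAudioFileSetProperty(extAudioFileRef,
                            kExtAudioFileProperty_ClientDataFormat,
                            sizeof(unitFormat),
                            &unitFormat);
}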