I'm trying to write an application that does digital signal processing, and I want it to be as lightweight as possible. One thing that confused me for a while was what the default stream formats of the various devices might be, so that I could avoid unwanted format conversions happening before data reaches my buffers. I found the following link http://club15cc.com/code-snippets/ios-2/get-the-default-output-stream-format-for-an-audio-unit-in-ios, which set me on what I believe is the right path.
I've extended the code from that link to create and activate an AVAudioSession before fetching the contents of the ASBD (AudioStreamBasicDescription); the session can then be used to request various "preferred" settings so I can see what effect they have. I've also combined Apple's sample code for printing the fields of an ASBD with the code from the link above.
The code below goes into the ViewController.m file generated by the Single View Application template. Note that you need to add AudioToolbox.framework and CoreAudio.framework to the project's Linked Frameworks and Libraries.
#import "ViewController.h"

@import AVFoundation;
@import AudioUnit;

@interface ViewController ()
@end

@implementation ViewController

- (void)printASBD:(AudioStreamBasicDescription)asbd {
    // Render the four-char format ID (e.g. 'lpcm') as a printable C string.
    char formatIDString[5];
    UInt32 formatID = CFSwapInt32HostToBig(asbd.mFormatID);
    bcopy(&formatID, formatIDString, 4);
    formatIDString[4] = '\0';

    NSLog(@"  Sample Rate:        %10.0f", asbd.mSampleRate);
    NSLog(@"  Format ID:          %10s",   formatIDString);
    NSLog(@"  Format Flags:       %10X",   (unsigned int)asbd.mFormatFlags);
    NSLog(@"  Bytes per Packet:   %10d",   (unsigned int)asbd.mBytesPerPacket);
    NSLog(@"  Frames per Packet:  %10d",   (unsigned int)asbd.mFramesPerPacket);
    NSLog(@"  Bytes per Frame:    %10d",   (unsigned int)asbd.mBytesPerFrame);
    NSLog(@"  Channels per Frame: %10d",   (unsigned int)asbd.mChannelsPerFrame);
    NSLog(@"  Bits per Channel:   %10d",   (unsigned int)asbd.mBitsPerChannel);
}

- (void)viewDidLoad
{
    [super viewDidLoad];

    NSError *error = nil;

    // Get a reference to the AudioSession and activate it
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [audioSession setActive:YES error:&error];

    // Then get a RemoteIO AudioUnit and use it to read the default AudioStreamBasicDescription
    AudioUnit remoteIOUnit;
    AudioComponentDescription audioComponentDesc = {0};
    audioComponentDesc.componentType = kAudioUnitType_Output;
    audioComponentDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    audioComponentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    // Find and instantiate the component
    AudioComponent audioComponent = AudioComponentFindNext(NULL, &audioComponentDesc);
    AudioComponentInstanceNew(audioComponent, &remoteIOUnit);

    // Read the stream format (ioDataSize must be a UInt32, not size_t)
    UInt32 asbdSize = sizeof(AudioStreamBasicDescription);
    AudioStreamBasicDescription asbd = {0};
    AudioUnitGetProperty(remoteIOUnit,
                         kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Output,
                         0,
                         (void *)&asbd,
                         &asbdSize);
    [self printASBD:asbd];
}

@end
I'd be very interested to know what results people get on other real hardware. Note that the code was built and deployed to iOS 7.1.

Answer 0 (score 0):

The format flags are:
kAudioFormatFlagIsFloat = (1 << 0), // 0x1
kAudioFormatFlagIsBigEndian = (1 << 1), // 0x2
kAudioFormatFlagIsSignedInteger = (1 << 2), // 0x4
kAudioFormatFlagIsPacked = (1 << 3), // 0x8
kAudioFormatFlagIsAlignedHigh = (1 << 4), // 0x10
kAudioFormatFlagIsNonInterleaved = (1 << 5), // 0x20
kAudioFormatFlagIsNonMixable = (1 << 6), // 0x40
kAudioFormatFlagsAreAllClear = (1 << 31),
The results I get for an iPad 4 are as follows:
Sample Rate: 0
Format ID: lpcm
Format Flags: 29
Bytes per Packet: 4
Frames per Packet: 1
Bytes per Frame: 4
Channels per Frame: 2
Bits per Channel: 32
I guess lpcm (linear pulse-code modulation) is no surprise. Format flags = 0x29 decomposes to kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked | kAudioFormatFlagIsNonInterleaved (0x1 | 0x8 | 0x20). Since the float flag is set, the 32 bits per channel indicate 32-bit float samples rather than the 8.24 "fixed float" canonical format I had expected.