I need to change my audio behavior depending on whether headphones are plugged in. I know about kAudioSessionProperty_AudioInputAvailable, which tells me whether a microphone is available, but I want to test for any headphones, not just a headset with a built-in microphone. Is this possible?
Answer 0 (score: 28)
Here is a method of my own; it is a slightly modified version of one found on this site: http://www.iphonedevsdk.com/forum/iphone-sdk-development/9982-play-record-same-time.html
- (BOOL)isHeadsetPluggedIn {
    UInt32 routeSize = sizeof(CFStringRef);
    CFStringRef route = NULL;
    OSStatus error = AudioSessionGetProperty(kAudioSessionProperty_AudioRoute,
                                             &routeSize,
                                             &route);
    /* Known values of route:
     * "Headset"
     * "Headphone"
     * "Speaker"
     * "SpeakerAndMicrophone"
     * "HeadphonesAndMicrophone"
     * "HeadsetInOut"
     * "ReceiverAndMicrophone"
     * "Lineout"
     */
    if (!error && (route != NULL)) {
        // Any route whose name starts with "Head" involves headphones or a headset.
        NSString *routeStr = (__bridge NSString *)route;
        NSRange headphoneRange = [routeStr rangeOfString:@"Head"];
        if (headphoneRange.location != NSNotFound) return YES;
    }
    return NO;
}
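For reference, a call site might look like the sketch below; the surrounding class is assumed to implement the method above, and the logging is only illustrative:

if ([self isHeadsetPluggedIn]) {
    NSLog(@"Headphones or a headset are attached");
} else {
    NSLog(@"Audio is going to the built-in speaker or receiver");
}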
Answer 1 (score: 12)
Here is a solution based on rob mayoff's comment:
- (BOOL)isHeadsetPluggedIn
{
    AVAudioSessionRouteDescription *route = [[AVAudioSession sharedInstance] currentRoute];
    BOOL headphonesLocated = NO;
    for (AVAudioSessionPortDescription *portDescription in route.outputs) {
        headphonesLocated |= [portDescription.portType isEqualToString:AVAudioSessionPortHeadphones];
    }
    return headphonesLocated;
}
You only need to link against the AVFoundation framework.
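A minimal sketch of wiring this up, assuming the method above lives in a view controller; beyond linking the framework, the import is all that is needed:

#import <AVFoundation/AVFoundation.h>

// e.g. somewhere in -viewDidLoad
NSLog(@"Headset plugged in: %@", [self isHeadsetPluggedIn] ? @"YES" : @"NO");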
Answer 2 (score: 3)
Some good news for any future readers of this post.
Most of the AudioToolbox audio session methods have been deprecated as of iOS 7 without a direct replacement, so the audio property listeners are now largely redundant.
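If you need to react to plug/unplug events going forward, here is a sketch of the AVAudioSession-based approach; it assumes an isHeadsetPluggedIn method like the one in the answer above and uses AVAudioSessionRouteChangeNotification to be told about route changes:

#import <AVFoundation/AVFoundation.h>

// Register once, e.g. in -viewDidLoad. Keep the returned observer token if you
// need to remove the observer later; __weak avoids retaining self in the block.
__weak typeof(self) weakSelf = self;
[[NSNotificationCenter defaultCenter] addObserverForName:AVAudioSessionRouteChangeNotification
                                                  object:nil
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    // Re-run the headphone check whenever the audio route changes.
    BOOL plugged = [weakSelf isHeadsetPluggedIn];
    NSLog(@"Route changed, headset plugged in: %d", plugged);
}];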
Answer 3 (score: 1)
I started with the code given above by jpsetung, but there were a few issues with it for my use case, among them the scant evidence for kAudioSessionProperty_AudioRoute in the documentation and the way route is handled.
This implementation extends the check to allow for any type of specified output:
BOOL isAudioRouteAvailable(CFStringRef routeType)
{
    /*
     As of iOS 5:
     kAudioSessionOutputRoute_LineOut;
     kAudioSessionOutputRoute_Headphones;
     kAudioSessionOutputRoute_BluetoothHFP;
     kAudioSessionOutputRoute_BluetoothA2DP;
     kAudioSessionOutputRoute_BuiltInReceiver;
     kAudioSessionOutputRoute_BuiltInSpeaker;
     kAudioSessionOutputRoute_USBAudio;
     kAudioSessionOutputRoute_HDMI;
     kAudioSessionOutputRoute_AirPlay;
     */

    // Prep
    BOOL foundRoute = NO;
    CFDictionaryRef description = NULL;

    // Session (initialize once)
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        AudioSessionInitialize(NULL, NULL, NULL, NULL);
    });

    // Property
    UInt32 propertySize;
    AudioSessionGetPropertySize(kAudioSessionProperty_AudioRouteDescription, &propertySize);
    OSStatus error = AudioSessionGetProperty(kAudioSessionProperty_AudioRouteDescription, &propertySize, &description);
    if (!error && description) {
        CFArrayRef outputs = CFDictionaryGetValue(description, kAudioSession_AudioRouteKey_Outputs);
        // Check the array before asking for its count.
        CFIndex count = outputs ? CFArrayGetCount(outputs) : 0;
        if (outputs && count) {
            for (CFIndex i = 0; i < count; i++) {
                CFDictionaryRef route = CFArrayGetValueAtIndex(outputs, i);
                CFStringRef type = CFDictionaryGetValue(route, kAudioSession_AudioRouteKey_Type);
                NSLog(@"Got audio route %@", type);
                // Audio route type
                if (CFStringCompare(type, routeType, 0) == kCFCompareEqualTo) {
                    foundRoute = YES;
                    break;
                }
            }
        }
    } else if (error) {
        NSLog(@"Audio route error %ld", (long)error);
    }

    // Cleanup
    if (description) {
        CFRelease(description);
    }

    // Done
    return foundRoute;
}
Use it like this:
if (isAudioRouteAvailable(kAudioSessionOutputRoute_BuiltInSpeaker)) {
    // Do great things...
}
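Since the question is specifically about headphones, the same helper can be pointed at the headphone route (the constant is one of those listed in the comment above):

if (isAudioRouteAvailable(kAudioSessionOutputRoute_Headphones)) {
    NSLog(@"Headphones or a headset are plugged in");
}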