I'm trying to implement something I assumed would be very easy, but it turns out not to be.
I'm working with the source code of the "react-native-audio" library, but for the purposes of this question you can assume I'm working locally.
Here is the reference for the source code I'm playing with.
My goal is simple: I'm using AVAudioRecorder
to record a meeting (roughly 30 minutes long). If a phone call comes in during recording, I want my app to "recover" by doing either of the following:
1) Pause on the incoming call and resume recording when the app returns to the foreground.
2) Answer the call, close the current file, and when the app returns to the foreground start a new recording (part 2) in a new file.
Obviously, (1) is preferred.
Note that I'm well aware of AVAudioSessionInterruptionNotification
and have been using it in my (so far unlucky) experiments, for example:
- (void)receiveAudioSessionNotification:(NSNotification *)notification
{
    if ([notification.name isEqualToString:AVAudioSessionInterruptionNotification]) {
        NSLog(@"AVAudioSessionInterruptionNotification");
        NSNumber *type = [notification.userInfo valueForKey:AVAudioSessionInterruptionTypeKey];
        if ([type isEqualToNumber:[NSNumber numberWithInt:AVAudioSessionInterruptionTypeBegan]]) {
            NSLog(@"*** InterruptionTypeBegan");
            [self pauseRecording];
        } else {
            NSLog(@"*** InterruptionTypeEnded");
            [_recordSession setActive:YES error:nil];
        }
    }
}
Note that I will be setting up a bounty for this question, but the only acceptable answer will be real-world working code, not code that "should work in theory". Thanks a lot for your help :)
Answer 0: (score: 2)
I chose AVAudioEngine
and AVAudioFile
as the solution, because the code is brief and AVFoundation's interruption handling is particularly simple (your player/recorder objects are paused, and un-pausing them reactivates your audio session).
NB: AVAudioFile
has no explicit close method; instead it writes the header and closes the file during dealloc
, a choice that regrettably complicates an otherwise simple API.
@interface ViewController ()

@property (nonatomic) AVAudioEngine *audioEngine;
@property AVAudioFile *outputFile;

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *error;
    if (![session setCategory:AVAudioSessionCategoryRecord error:&error]) {
        NSLog(@"Failed to set session category: %@", error);
    }

    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(audioInterruptionHandler:)
                                                 name:AVAudioSessionInterruptionNotification
                                               object:nil];

    NSURL *outputURL = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory
                                                               inDomains:NSUserDomainMask][0]
                        URLByAppendingPathComponent:@"output.aac"];

    __block BOOL outputFileInited = NO;

    self.audioEngine = [[AVAudioEngine alloc] init];
    AVAudioInputNode *inputNode = self.audioEngine.inputNode;

    [inputNode installTapOnBus:0 bufferSize:512 format:nil block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
        NSError *error;

        // Lazily create the output file, matching the format of the first captured buffer.
        if (self.outputFile == nil && !outputFileInited) {
            NSDictionary *settings = @{
                AVFormatIDKey: @(kAudioFormatMPEG4AAC),
                AVNumberOfChannelsKey: @(buffer.format.channelCount),
                AVSampleRateKey: @(buffer.format.sampleRate)
            };
            self.outputFile = [[AVAudioFile alloc] initForWriting:outputURL settings:settings error:&error];
            if (!self.outputFile) {
                NSLog(@"output file error: %@", error);
                abort();
            }
            outputFileInited = YES;
        }

        if (self.outputFile && ![self.outputFile writeFromBuffer:buffer error:&error]) {
            NSLog(@"AVAudioFile write error: %@", error);
        }
    }];

    if (![self.audioEngine startAndReturnError:&error]) {
        NSLog(@"engine start error: %@", error);
    }

    // To stop recording, nil the outputFile at some point in the future.
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(20 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
        NSLog(@"Finished");
        self.outputFile = nil;
    });
}

// https://developer.apple.com/library/archive/documentation/Audio/Conceptual/AudioSessionProgrammingGuide/HandlingAudioInterruptions/HandlingAudioInterruptions.html
- (void)audioInterruptionHandler:(NSNotification *)notification {
    NSDictionary *info = notification.userInfo;
    AVAudioSessionInterruptionType type = [info[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];

    switch (type) {
        case AVAudioSessionInterruptionTypeBegan:
            NSLog(@"Begin interruption");
            break;
        case AVAudioSessionInterruptionTypeEnded:
            NSLog(@"End interruption");
            // or ignore shouldResume if you're really keen to resume recording
            AVAudioSessionInterruptionOptions endOptions = [info[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];
            if (AVAudioSessionInterruptionOptionShouldResume == endOptions) {
                NSError *error;
                if (![self.audioEngine startAndReturnError:&error]) {
                    NSLog(@"Error restarting engine: %@", error);
                }
            }
            break;
    }
}

@end
You may want to enable background audio (and add an NSMicrophoneUsageDescription
string to your Info.plist).
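As a sketch, those two Info.plist additions would look like the fragment below (the key names are the standard iOS ones; the usage-description string is only an example):

```xml
<!-- Shown to the user when the app first asks for microphone access. -->
<key>NSMicrophoneUsageDescription</key>
<string>This app records meeting audio.</string>

<!-- Allows the audio session to keep recording while the app is in the background. -->
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```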