AVAudioRecorder only records audio after an interruption

Asked: 2012-01-20 10:20:51

Tags: iphone objective-c avfoundation avaudiorecorder avaudiosession

In my application, which records and plays audio using AVAudioRecorder and AVAudioPlayer, I ran into a problem with incoming phone calls. If a call arrives while a recording is in progress, the resulting file contains only the audio captured after the call. I want the audio recorded after the call to be a continuation of the recording made before the call.

I track interruptions to the recorder using the AVAudioRecorderDelegate methods

  • (void)audioRecorderBeginInterruption:(AVAudioRecorder *)avRecorder and
  • (void)audioRecorderEndInterruption:(AVAudioRecorder *)avRecorder。

In my EndInterruption method, I reactivate the audio session.
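
For reference, a simplified sketch of that interruption handling looks like this (assuming the recorder is resumed with -record after the session is reactivated; the real code also updates the UI):

- (void)audioRecorderBeginInterruption:(AVAudioRecorder *)avRecorder
{
    // The system has already stopped the recorder by the time this is called.
    DEBUG_LOG(@"Recording interrupted");
}

- (void)audioRecorderEndInterruption:(AVAudioRecorder *)avRecorder
{
    // Reactivate the audio session, then resume recording
    NSError *err = nil;
    [[AVAudioSession sharedInstance] setActive:YES error:&err];
    if (err)
    {
        DEBUG_LOG(@"audioSession: %@ %ld %@", [err domain], (long)[err code], [[err userInfo] description]);
        return;
    }
    [avRecorder record]; // resume (assumed); only the post-call audio ends up in the file
}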

Here is the recording code I use:

- (void)startRecordingProcess
{
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    NSError *err = nil;
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
    if(err)
    {
        DEBUG_LOG(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }
    err = nil;
    [audioSession setActive:YES error:&err];
    if(err)
    {
        DEBUG_LOG(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }
    // Record settings for recording the audio
    recordSetting = [[NSDictionary alloc] initWithObjectsAndKeys:
                     [NSNumber numberWithInt:kAudioFormatMPEG4AAC],AVFormatIDKey,
                     [NSNumber numberWithInt:44100],AVSampleRateKey,
                     [NSNumber numberWithInt: 2],AVNumberOfChannelsKey,
                     [NSNumber numberWithInt:16],AVLinearPCMBitDepthKey,
                     [NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey,
                     [NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
                     nil];
    BOOL fileExists = [[NSFileManager defaultManager] fileExistsAtPath:recorderFilePath];
    if (fileExists) 
    {        
        BOOL appendingFileExists = 
            [[NSFileManager defaultManager] fileExistsAtPath:appendingFilePath];
        if (appendingFileExists)
        {
            [[NSFileManager defaultManager]removeItemAtPath:appendingFilePath error:nil];
        }
        if (appendingFilePath) 
        {
            [appendingFilePath release];
            appendingFilePath = nil;
        }
        appendingFilePath = [[NSString alloc]initWithFormat:@"%@/AppendedAudio.m4a", DOCUMENTS_FOLDER];
        fileUrl = [NSURL fileURLWithPath:appendingFilePath]; 
    }
    else 
    {
        isFirstTime = YES;
        if (recorderFilePath) 
        {
            DEBUG_LOG(@"Testing 2");
            [recorderFilePath release];
            recorderFilePath = nil;
        }
        DEBUG_LOG(@"Testing 3");
        recorderFilePath = [[NSString alloc]initWithFormat:@"%@/RecordedAudio.m4a", DOCUMENTS_FOLDER];
        fileUrl = [NSURL fileURLWithPath:recorderFilePath];
    }
    err = nil;
    recorder = [[AVAudioRecorder alloc] initWithURL:fileUrl settings:recordSetting error:&err];
    if(!recorder)
    {
        DEBUG_LOG(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        [[AlertFunctions sharedInstance] showMessageWithTitle:kAppName 
                                                      message:[err localizedDescription] 
                                                     delegate:nil
                                            cancelButtonTitle:@"Ok"];
        return;
    }
    //prepare to record
    [recorder setDelegate:self];
    [recorder prepareToRecord];
    recorder.meteringEnabled = YES;
    [recorder record];

}

While searching for a solution I came across other posts describing the same problem: "how to resume recording after interruption occured in iphone?" and http://www.iphonedevsdk.com/forum/iphone-sdk-development/31268-avaudiorecorderdelegate-interruption.html. I tried the suggestions given in those links, but without success. I would prefer a solution that works with AVAudioRecorder itself. Is there any way to solve this problem? All valuable suggestions are appreciated.

2 Answers:

Answer 0 (score: 4)

After much research, Apple informed me that this is a limitation of the current API. I worked around it by saving the audio file recorded before the interruption and concatenating it with the file recorded after resuming. I hope this helps anyone facing the same issue.
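
For anyone looking for a starting point, a minimal sketch of that concatenation workaround could look like the following (names such as firstPartURL, resumedPartURL and mergedFileURL are placeholders, and <AVFoundation/AVFoundation.h> is assumed to be imported; the next answer contains a more complete, multi-file version):

    // firstPartURL / resumedPartURL: placeholder URLs for the pre- and post-interruption files
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *track =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    // Insert the two segments back to back on a single audio track
    NSError *error = nil;
    CMTime cursor = kCMTimeZero;
    for (NSURL *url in @[firstPartURL, resumedPartURL]) {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
        AVAssetTrack *sourceTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                       ofTrack:sourceTrack
                        atTime:cursor
                         error:&error];
        cursor = CMTimeAdd(cursor, asset.duration);
    }

    // Export the combined track to a single .m4a file
    AVAssetExportSession *exportSession =
        [AVAssetExportSession exportSessionWithAsset:composition
                                          presetName:AVAssetExportPresetAppleM4A];
    exportSession.outputURL = mergedFileURL; // placeholder output URL
    exportSession.outputFileType = AVFileTypeAppleM4A;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        // the merged recording is at mergedFileURL once status is AVAssetExportSessionStatusCompleted
    }];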

Answer 1 (score: 3)

I ran into a similar problem where AVAudioRecorder recorded only the audio after the interruption. I solved it by maintaining a series of recordings, saving them in NSTemporaryDirectory, and merging them all at the end.

Here are the key steps:

  1. Make your class listen for AVAudioSessionInterruptionNotification.
  2. When the interruption begins (AVAudioSessionInterruptionTypeBegan), save the current recording.
  3. When the interruption ends (AVAudioSessionInterruptionTypeEnded), start a new recording if the interruption options include AVAudioSessionInterruptionOptionShouldResume.
  4. Append all of the recordings when the save button is tapped.
  5. Code snippets for the steps above:

    // 1. Make this class listen to the AVAudioSessionInterruptionNotification in viewDidLoad
    - (void)viewDidLoad
    {
        [super viewDidLoad];
    
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(handleAudioSessionInterruption:)
                                                     name:AVAudioSessionInterruptionNotification
                                                   object:[AVAudioSession sharedInstance]];
    
        // other coding stuff
    }
    
    // observe the interruption begin / end 
    - (void)handleAudioSessionInterruption:(NSNotification*)notification
    {
        AVAudioSessionInterruptionType interruptionType = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
        AVAudioSessionInterruptionOptions interruptionOption = [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];
    
        switch (interruptionType) {
            // 2. save recording on interruption begin
            case AVAudioSessionInterruptionTypeBegan:{
                // stop recording
                // Update the UI accordingly
                break;
            }
            case AVAudioSessionInterruptionTypeEnded:{
                if (interruptionOption == AVAudioSessionInterruptionOptionShouldResume) {
                    // create a new recording
                    // Update the UI accordingly
                }
                break;
            }
    
            default:
                break;
        }
    }  
    
    // 4. append all recordings
    - (void) audioRecorderDidFinishRecording:(AVAudioRecorder *)avrecorder successfully:(BOOL)flag
    {
        // append all recordings one after other
    }
    

Here is a working example:

    //
    //  XDRecordViewController.m
    //
    //  Created by S1LENT WARRIOR
    //
    
    #import "XDRecordViewController.h"
    
    @interface XDRecordViewController ()
    {
        AVAudioRecorder *recorder;
    
        __weak IBOutlet UIButton* btnRecord;
        __weak IBOutlet UIButton* btnSave;
        __weak IBOutlet UIButton* btnDiscard;
        __weak IBOutlet UILabel*  lblTimer; // a UILabel to display the recording time
    
        // some variables to display the timer on a lblTimer
        NSTimer* timer;
        NSTimeInterval intervalTimeElapsed;
        NSDate* pauseStart;
        NSDate* previousFireDate;
        NSDate* recordingStartDate;
    
        // interruption handling variables
        BOOL isInterrupted;
        NSInteger preInterruptionDuration;
    
        NSMutableArray* recordings; // an array of recordings to be merged in the end
    }
    @end
    
    @implementation XDRecordViewController
    
    - (void)viewDidLoad
    {
        [super viewDidLoad];
    
        // Make this class listen to the AVAudioSessionInterruptionNotification
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(handleAudioSessionInterruption:)
                                                     name:AVAudioSessionInterruptionNotification
                                                   object:[AVAudioSession sharedInstance]];
    
        [self clearContentsOfDirectory:NSTemporaryDirectory()]; // clear contents of NSTemporaryDirectory()
    
        recordings = [NSMutableArray new]; // initialize recordings
    
        [self setupAudioSession]; // setup the audio session. you may customize it according to your requirements
    }
    
    - (void)viewDidAppear:(BOOL)animated
    {
        [super viewDidAppear:animated];
    
        [self initRecording];   // start recording as soon as the view appears
    }
    
    - (void)dealloc
    {
        [self clearContentsOfDirectory:NSTemporaryDirectory()]; // remove all files from NSTemporaryDirectory
    
        [[NSNotificationCenter defaultCenter] removeObserver:self]; // remove this class from NSNotificationCenter
    }
    
    #pragma mark - Event Listeners
    
    // called when recording button is tapped
    - (IBAction) btnRecordingTapped:(UIButton*)sender
    {
        sender.selected = !sender.selected; // toggle the button
    
        if (sender.selected) { // resume recording
            [recorder record];
            [self resumeTimer];
        } else { // pause recording
            [recorder pause];
            [self pauseTimer];
        }
    }
    
    // called when save button is tapped
    - (IBAction) btnSaveTapped:(UIButton*)sender
    {
        [self pauseTimer]; // pause the timer
    
        // disable the UI while the recording is saving so that user may not press the save, record or discard button again
        btnSave.enabled = NO;
        btnRecord.enabled = NO;
        btnDiscard.enabled = NO;
    
        [recorder stop]; // stop the AVAudioRecorder so that the audioRecorderDidFinishRecording delegate function may get called
    
        // Deactivate the AVAudioSession
        NSError* error;
        [[AVAudioSession sharedInstance] setActive:NO error:&error];
        if (error) {
            NSLog(@"%@", error);
        }
    }
    
    // called when discard button is tapped
    - (IBAction) btnDiscardTapped:(id)sender
    {
        [self stopTimer]; // stop the timer
    
        recorder.delegate = Nil; // set delegate to Nil so that audioRecorderDidFinishRecording delegate function may not get called
        [recorder stop];  // stop the recorder
    
        // Deactivate the AVAudioSession
        NSError* error;
        [[AVAudioSession sharedInstance] setActive:NO error:&error];
        if (error) {
            NSLog(@"%@", error);
        }
    
        [self.navigationController popViewControllerAnimated:YES];
    }
    
    #pragma mark - Notification Listeners
    // called when an AVAudioSessionInterruption occurs
    - (void)handleAudioSessionInterruption:(NSNotification*)notification
    {
        AVAudioSessionInterruptionType interruptionType = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
        AVAudioSessionInterruptionOptions interruptionOption = [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];
    
        switch (interruptionType) {
            case AVAudioSessionInterruptionTypeBegan:{
                // • Recording has stopped, already inactive
                // • Change state of UI, etc., to reflect non-recording state
                preInterruptionDuration += recorder.currentTime; // time elapsed
                if(btnRecord.selected) {    // timer is already running
                    [self btnRecordingTapped:btnRecord];  // pause the recording and pause the timer
                }
    
                recorder.delegate = Nil; // Set delegate to nil so that audioRecorderDidFinishRecording may not get called
                [recorder stop];    // stop recording
                isInterrupted = YES;
                break;
            }
            case AVAudioSessionInterruptionTypeEnded:{
                // • Make session active
                // • Update user interface
                // • AVAudioSessionInterruptionOptionShouldResume option
                if (interruptionOption == AVAudioSessionInterruptionOptionShouldResume) {
                    // Here you should create a new recording
                    [self initRecording];   // create a new recording
                    [self btnRecordingTapped:btnRecord];
                }
                break;
            }
    
            default:
                break;
        }
    }
    
    #pragma mark - AVAudioRecorderDelegate
    - (void) audioRecorderDidFinishRecording:(AVAudioRecorder *)avrecorder successfully:(BOOL)flag
    {
        [self appendAudiosAtURLs:recordings completion:^(BOOL success, NSURL *outputUrl) {
            // do whatever you want with the new audio file :)
        }];
    }
    
    #pragma mark - Timer
    - (void)timerFired:(NSTimer*)timer
    {
        intervalTimeElapsed++;
        [self updateDisplay];
    }
    
    // converts a time interval to an mm:ss string
    - (NSString*) timerStringSinceTimeInterval:(NSTimeInterval)timeInterval
    {
        NSDate *timerDate = [NSDate dateWithTimeIntervalSince1970:timeInterval];
        NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
        [dateFormatter setDateFormat:@"mm:ss"];
        [dateFormatter setTimeZone:[NSTimeZone timeZoneForSecondsFromGMT:0.0]];
        return [dateFormatter stringFromDate:timerDate];
    }
    
    // called when recording pauses
    - (void) pauseTimer
    {
        pauseStart = [NSDate dateWithTimeIntervalSinceNow:0];
    
        previousFireDate = [timer fireDate];
    
        [timer setFireDate:[NSDate distantFuture]];
    }
    
    - (void) resumeTimer
    {
        if (!timer) {
            timer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                     target:self
                                                   selector:@selector(timerFired:)
                                                   userInfo:Nil
                                                    repeats:YES];
            return;
        }
    
        float pauseTime = - 1 * [pauseStart timeIntervalSinceNow];
    
        [timer setFireDate:[previousFireDate dateByAddingTimeInterval:pauseTime]]; // shift the fire date forward by the paused duration
    }
    
    - (void)stopTimer
    {
        [self updateDisplay];
        [timer invalidate];
        timer = nil;
    }
    
    - (void)updateDisplay
    {
        lblTimer.text = [self timerStringSinceTimeInterval:intervalTimeElapsed];
    }
    
    #pragma mark - Helper Functions
    - (void) initRecording
    {
    
        // Set the audio file
        NSString* name = [NSString stringWithFormat:@"recording_%@.m4a", @(recordings.count)]; // creating a unique name for each audio file
        NSURL *outputFileURL = [NSURL fileURLWithPathComponents:@[NSTemporaryDirectory(), name]];
    
        [recordings addObject:outputFileURL];
    
        // Define the recorder settings
        NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
    
        [recordSetting setValue:@(kAudioFormatMPEG4AAC) forKey:AVFormatIDKey];
        [recordSetting setValue:@(44100.0) forKey:AVSampleRateKey];
        [recordSetting setValue:@(1) forKey:AVNumberOfChannelsKey];
    
        NSError* error;
        // Initiate and prepare the recorder
        recorder = [[AVAudioRecorder alloc] initWithURL:outputFileURL settings:recordSetting error:&error];
        recorder.delegate = self;
        recorder.meteringEnabled = YES;
        [recorder prepareToRecord];
    
        if (![AVAudioSession sharedInstance].inputAvailable) { // can not record audio if mic is unavailable
            NSLog(@"Error: Audio input device not available!");
            return;
        }
    
        intervalTimeElapsed = 0;
        recordingStartDate = [NSDate date];
    
        if (isInterrupted) {
            intervalTimeElapsed = preInterruptionDuration;
            isInterrupted = NO;
        }
    
        // Activate the AVAudioSession
        [[AVAudioSession sharedInstance] setActive:YES error:&error];
        if (error) {
            NSLog(@"%@", error);
        }
    
        recordingStartDate = [NSDate date];  // Set the recording start date
        [self btnRecordingTapped:btnRecord];
    }
    
    - (void)setupAudioSession
    {
    
        static BOOL audioSessionSetup = NO;
        if (audioSessionSetup) {
            return;
        }
    
        AVAudioSession* session = [AVAudioSession sharedInstance];
    
        [session setCategory:AVAudioSessionCategoryPlayAndRecord
                 withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
                       error:Nil];
    
        [session setMode:AVAudioSessionModeSpokenAudio error:nil];
    
        audioSessionSetup = YES;
    }
    
    // takes an array of audio file URLs and appends them to one another
    // the basic logic was derived from here: http://stackoverflow.com/a/16040992/634958
    // I modified this logic to append multiple files
    - (void) appendAudiosAtURLs:(NSMutableArray*)urls completion:(void(^)(BOOL success, NSURL* outputUrl))handler
    {
        // Create a new audio track we can append to
        AVMutableComposition* composition = [AVMutableComposition composition];
        AVMutableCompositionTrack* appendedAudioTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    
        // Grab the first audio file that needs to be appended
        AVURLAsset* originalAsset = [[AVURLAsset alloc]
                                     initWithURL:urls.firstObject options:nil];
        [urls removeObjectAtIndex:0];
    
        NSError* error = nil;
    
        // Grab the first audio track and insert it into our appendedAudioTrack
        AVAssetTrack *originalTrack = [[originalAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, originalAsset.duration);
        [appendedAudioTrack insertTimeRange:timeRange
                                    ofTrack:originalTrack
                                     atTime:kCMTimeZero
                                      error:&error];
        CMTime duration = originalAsset.duration;
    
        if (error) {
            if (handler) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    handler(NO, Nil);
                });
            }
            return; // bail out: the first segment could not be inserted
        }
    
        for (NSURL* audioUrl in urls) {
            AVURLAsset* newAsset = [[AVURLAsset alloc]
                                    initWithURL:audioUrl options:nil];
    
            // Grab the rest of the audio tracks and insert them at the end of each other
            AVAssetTrack *newTrack = [[newAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
            timeRange = CMTimeRangeMake(kCMTimeZero, newAsset.duration);
            [appendedAudioTrack insertTimeRange:timeRange
                                        ofTrack:newTrack
                                         atTime:duration
                                          error:&error];
    
            duration = appendedAudioTrack.timeRange.duration;
    
            if (error) {
                if (handler) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        handler(NO, Nil);
                    });
                }
                return; // bail out: this segment could not be inserted
            }
        }
    
        // Create a new audio file using the appendedAudioTrack
        AVAssetExportSession* exportSession = [AVAssetExportSession
                                               exportSessionWithAsset:composition
                                               presetName:AVAssetExportPresetAppleM4A];
        if (!exportSession) {
            if (handler) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    handler(NO, Nil);
                });
            }
            return; // cannot export without a session
        }
    
        NSArray* appendedAudioPath = @[NSTemporaryDirectory(), @"temp.m4a"]; // name of the final audio file
        NSURL* mergedFileUrl = [NSURL fileURLWithPathComponents:appendedAudioPath];
        [[NSFileManager defaultManager] removeItemAtURL:mergedFileUrl error:nil]; // the export fails if the output file already exists
        exportSession.outputURL = mergedFileUrl;
        exportSession.outputFileType = AVFileTypeAppleM4A;
        [exportSession exportAsynchronouslyWithCompletionHandler:^{
    
            BOOL success = NO;
            // exported successfully?
            switch (exportSession.status) {
                case AVAssetExportSessionStatusFailed:
                    break;
                case AVAssetExportSessionStatusCompleted: {
                    success = YES;
    
                    break;
                }
                case AVAssetExportSessionStatusWaiting:
                    break;
                default:
                    break;
            }
    
            if (handler) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    handler(success, exportSession.outputURL);
                });
            }
        }];
    }
    
    - (void) clearContentsOfDirectory:(NSString*)directory
    {
        NSFileManager *fm = [NSFileManager defaultManager];
        NSError *error = nil;
        for (NSString *file in [fm contentsOfDirectoryAtPath:directory error:&error]) {
            [fm removeItemAtURL:[NSURL fileURLWithPathComponents:@[directory, file]] error:&error];
        }
    }
    
    @end
    

I know it is too late to answer this question, but I hope it helps someone else!