Getting an error like [Utility] +[AFAggregator logDictationFailedWithError:] Error Domain=kAFAssistantErrorDomain Code=209 "(null)"

Date: 2017-02-09 19:22:38

Tags: objective-c speech-recognition text-to-speech speech

I am trying to develop an application that does both speech recognition and text-to-speech. That means: while I am speaking, text-to-speech is turned off, and while text-to-speech is playing, speech recognition is turned off.

Steps:

1. Open the app - the app should start recognizing speech.
2. Start saying something, for example "Hello".
3. Say "play" - on that command the app should speak the text shown on the screen.
4. Start speaking again - once the voice has finished playing, I want the speech recognition process to resume.

The steps above are the flow I expect, and speech recognition should keep running throughout. For speech-to-text I use Apple's Speech framework; for text-to-speech I use the AVFoundation and MediaPlayer libraries.

The problem I am running into:

1. I say "Hello" and it prints "Hello" on the screen.
2. I say "play" as a command and the app speaks the text.
3. After that I get the error [Utility] +[AFAggregator logDictationFailedWithError:] Error Domain=kAFAssistantErrorDomain Code=209 "(null)". I don't know why it happens. I have searched Google but found no proper solution; please help me fix it.

Here is my code:

@interface ViewController ()

@end

@implementation ViewController
{
    SFSpeechAudioBufferRecognitionRequest *recognitionRequest;
    SFSpeechRecognitionTask *recognitionTask;
    AVAudioEngine *audioEngine;
    NSMutableArray *speechStringsArray;
    BOOL SpeechToText;
    NSString* resultString;
    NSString *str ;
    NSString *searchString;
    AVAudioInputNode *inputNode;
    BOOL didStartSpeaking;
    NSString *textToSpeak;
    NSMutableArray *speechCommandArray;

}

- (void)viewDidLoad {
    [super viewDidLoad];

    //Speech To Text ****
    speechStringsArray = [[NSMutableArray alloc]init];
    [self.textView resignFirstResponder];
    [self.textView setDelegate:self];


    // Initialize background audio session
    NSError *error = NULL;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:&error];
    if (error) {
        NSLog(@"Error: %@", error);
    }
    [session setActive:YES error:&error];
    if (error) {
        NSLog(@"Error: %@", error);
    }

    // Enabled remote controls
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];

    // Voice setup
    self.voicePicker.delegate = self;
    self.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-us"];
    self.voices = [NSMutableArray arrayWithObjects:
                   @{@"voice" : @"en-us", @"label" : @"American English (Female)"},
                   @{@"voice" : @"en-au", @"label" : @"Australian English (Female)"},
                   @{@"voice" : @"en-gb", @"label" : @"British English (Male)"},
                   @{@"voice" : @"en-ie", @"label" : @"Irish English (Female)"},
                   @{@"voice" : @"en-za", @"label" : @"South African English (Female)"},
                   nil];

    // Synthesizer setup
    self.synthesizer = [[AVSpeechSynthesizer alloc] init];
    self.synthesizer.delegate = self;

    // This notification is posted from the AppDelegate's applicationDidBecomeActive method, to make sure that if the play/pause state changed in the background, the button in the toolbar is updated.
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(updateToolbar) name:@"updateToolbar" object:nil];

    if (self.textView.text.length > 0) {
        [self.playPauseBarButtonItem setEnabled:YES];
    }
    else {
        [self.playPauseBarButtonItem setEnabled:NO];
    }
}


-(void)viewDidAppear:(BOOL)animated
{

    // Note: -localeWithLocaleIdentifier: takes a single identifier;
    // "en-US en-UK en-IN" is not a valid locale and can leave the recognizer nil.
    self.speechRecognizer = [[SFSpeechRecognizer alloc] initWithLocale:[NSLocale localeWithLocaleIdentifier:@"en-US"]];



    self.speechRecognizer.delegate = self;

    audioEngine = [[AVAudioEngine alloc]init];

    self.textView.text = @"";

    [SFSpeechRecognizer requestAuthorization:^(SFSpeechRecognizerAuthorizationStatus authStatus) {
        switch (authStatus) {
            case SFSpeechRecognizerAuthorizationStatusAuthorized:
                //User gave access to speech recognition
                NSLog(@"Authorized");

                [self start_record];


                break;

            case SFSpeechRecognizerAuthorizationStatusDenied:
                //User denied access to speech recognition
                NSLog(@"AuthorizationStatusDenied");

                break;

            case SFSpeechRecognizerAuthorizationStatusRestricted:
                //Speech recognition restricted on this device
                NSLog(@"AuthorizationStatusRestricted");

                break;

            case SFSpeechRecognizerAuthorizationStatusNotDetermined:
                //Speech recognition not yet authorized

                break;

            default:
                NSLog(@"Default");
                break;
        }
    }];

}

#pragma mark - Interface Builder Actions

// Method to adjust the speech rate
- (IBAction)handleSpeedStepper:(UIStepper *)sender
{
    double speedValue = self.speedStepper.value;
    [self.speedValueLabel setText:[NSString stringWithFormat:@"%.1f", speedValue]];
}
// Method to adjust the pitch
- (IBAction)handlePitchStepper:(UIStepper *)sender
{
    double pitchValue = self.pitchStepper.value;
    [self.pitchValueLabel setText:[NSString stringWithFormat:@"%.1f", pitchValue]];
}
// Method to play or pause the voice
- (IBAction)handlePlayPauseButton:(UIBarButtonItem *)sender
{

    if (self.synthesizer.speaking && !self.synthesizer.paused) {

        if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
            // Stop immediately
            [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
        }

        else {
            // Stop at end of current word
            [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
        }


        [self updateToolbarWithButton:@"play"];
    }

    else if (self.synthesizer.paused) {
        [self.synthesizer continueSpeaking];
        [self updateToolbarWithButton:@"pause"];
    }


    else {
        if (self.textView.text.length > 0) {
            [self speakUtterance];
            [self updateToolbarWithButton:@"pause"];
        }
        else {
            [self updateToolbarWithButton:@"play"];
        }
    }
}

// Method for recognizing the user's speech

- (void)start_record {

    NSError *outError;

    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&outError];
    [audioSession setMode:AVAudioSessionModeMeasurement error:&outError];
    [audioSession setActive:YES withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation error:&outError];

    recognitionRequest = [[SFSpeechAudioBufferRecognitionRequest alloc] init];
    inputNode = audioEngine.inputNode;

    if (recognitionRequest == nil) {
        NSLog(@"Unable to create a SFSpeechAudioBufferRecognitionRequest object");
    }
    if (inputNode == nil) {
        NSLog(@"Audio engine has no input node");
    }

    // Configure the request so that results are returned before the audio recording is finished.
    [recognitionRequest setShouldReportPartialResults:YES];

    // A recognition task represents a speech recognition session.
    // We keep a reference to the task so that it can be cancelled.
    recognitionTask = [self.speechRecognizer recognitionTaskWithRequest:recognitionRequest resultHandler:^(SFSpeechRecognitionResult * result, NSError *  error1) {

        BOOL isFinal = NO;

        if (result != nil) {

            NSString *speech = result.bestTranscription.formattedString;
            NSLog(@"the speech: %@", speech);

            // Remove any previously recognized strings from the new transcription.
            for (int i = 0; i < speechStringsArray.count; i++) {
                str = [speechStringsArray objectAtIndex:i];

                NSRange range = [speech rangeOfString:str options:NSCaseInsensitiveSearch];
                NSLog(@"found: %@", (range.location != NSNotFound) ? @"Yes" : @"No");

                if (range.location != NSNotFound) {
                    resultString = [speech stringByReplacingCharactersInRange:range withString:@""];
                    speech = resultString;
                    NSLog(@"the result is: %@", resultString);
                }
                else {
                    resultString = speech;
                }
            }


            // Commands: "play" speaks the text, "exit" quits, and so on.
            if (resultString.length > 0) {

                if ([resultString isEqualToString:@" play"]) {
                    [self play];
                }
                else if ([resultString isEqualToString:@" exit"]) {
                    UIApplication *app = [UIApplication sharedApplication];
                    [self applicationWillTerminate:app];
                }
                else if ([speech isEqualToString:@" mute"]) {
                    [[MPMusicPlayerController applicationMusicPlayer] setVolume:0];
                }
                else if ([speech isEqualToString:@" up"]) {
                    float vol = [[AVAudioSession sharedInstance] outputVolume];
                    vol++;
                    [[MPMusicPlayerController applicationMusicPlayer] setVolume:vol];
                }
                else if ([speech isEqualToString:@" down"]) {
                    float vol = [[AVAudioSession sharedInstance] outputVolume];
                    vol--;
                    [[MPMusicPlayerController applicationMusicPlayer] setVolume:vol];
                }
                else if ([speech isEqualToString:@" speed"]) {
                    NSString *speedup = self.speedValueLabel.text;
                    double speed = [speedup doubleValue];
                    speed += 0.1;
                    [self.speedValueLabel setText:[NSString stringWithFormat:@"%.1f", speed]];
                }
                else {
                    self.textView.text = [NSString stringWithFormat:@"%@%@", self.textView.text, resultString];

                    if (![resultString isEqualToString:@" play"]) {
                        [speechStringsArray addObject:resultString];
                    }
                }
            }

            else {

                if ([speech isEqualToString:@"Play"]) {
                    [self play];
                }

                if ([speech isEqualToString:@"Speed"]) {
                    NSString *speedup = self.speedValueLabel.text;
                    double speed = [speedup doubleValue];
                    speed += 0.1;
                    [self.speedValueLabel setText:[NSString stringWithFormat:@"%.1f", speed]];
                }
                else if ([speech isEqualToString:@"Exit"]) {
                    UIApplication *app = [UIApplication sharedApplication];
                    [self applicationWillTerminate:app];
                }
                else if ([speech isEqualToString:@"Mute"]) {
                    [[MPMusicPlayerController applicationMusicPlayer] setVolume:0];
                }
                else if ([speech isEqualToString:@"Up"]) {
                    float vol = [[AVAudioSession sharedInstance] outputVolume];
                    vol++;
                    [[MPMusicPlayerController applicationMusicPlayer] setVolume:vol];
                }
                else if ([speech isEqualToString:@"Down"]) {
                    float vol = [[AVAudioSession sharedInstance] outputVolume];
                    vol--;
                    [[MPMusicPlayerController applicationMusicPlayer] setVolume:vol];
                }
                else {
                    if (![speech isEqualToString:@"Play"]) {
                        [speechStringsArray addObject:speech];

                        if (self.textView.text.length > 0) {
                            self.textView.text = [NSString stringWithFormat:@"%@%@", self.textView.text, speech];
                        }
                        else {
                            self.textView.text = speech;
                        }
                    }
                }
            }

            NSLog(@" array %@",speechStringsArray);


            isFinal = result.isFinal;

        }


        if (error1 != nil || isFinal) {

            // Tear down in order: stop the engine and the tap, end the audio
            // for the request, and only then drop the references. (Nil-ing the
            // request first would send -endAudio to nil.)
            [audioEngine stop];
            [inputNode removeTapOnBus:0];
            [recognitionRequest endAudio];

            recognitionRequest = nil;
            recognitionTask = nil;

            [self start_record];
        }

    }];

    AVAudioFormat *recordingFormat = [inputNode outputFormatForBus:0];

    [inputNode installTapOnBus:0 bufferSize:1024 format:recordingFormat block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
        [recognitionRequest appendAudioPCMBuffer:buffer];
    }];

    NSError *error1;
    [audioEngine prepare];
    if (![audioEngine startAndReturnError:&error1]) {
        NSLog(@"audioEngine failed to start: %@", error1);
    }
}

-(void)play
{
    if (self.synthesizer.speaking && !self.synthesizer.paused) {

        if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
            // Stop immediately
            [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
        }

        else {
            // Stop at end of current word
            [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
        }


        [self updateToolbarWithButton:@"play"];
    }

    else if (self.synthesizer.paused) {
        [self.synthesizer continueSpeaking];
        [self updateToolbarWithButton:@"pause"];
    }


    else {
        if (self.textView.text.length > 0) {
            [self speakUtterance];
            [self updateToolbarWithButton:@"pause"];
        }
        else {
            [self updateToolbarWithButton:@"play"];
        }
    }
}
- (void)applicationWillTerminate:(UIApplication *)application
{
    exit(0);
}

// Text-to-speech method

- (void)speakUtterance
{
    if (audioEngine.isRunning) {
        NSLog(@"Running");

        [speechStringsArray removeAllObjects];
        resultString = @"";

        // Stop the engine and end the request before dropping the references;
        // otherwise -endAudio is sent to nil and the session is left dangling.
        [audioEngine stop];
        [inputNode removeTapOnBus:0];
        [recognitionRequest endAudio];

        recognitionRequest = nil;
        recognitionTask = nil;
    }

    if (self.textView.text.length > 0) {
        NSLog(@"speakUtterance");
        didStartSpeaking = NO;
        textToSpeak = [NSString stringWithFormat:@"%@", self.textView.text];
        AVSpeechUtterance *utterance = [[AVSpeechUtterance alloc] initWithString:textToSpeak];
        utterance.rate = self.speedStepper.value;
        utterance.pitchMultiplier = self.pitchStepper.value;
        utterance.voice = self.voice;
        [self.synthesizer speakUtterance:utterance];
        [self displayBackgroundMediaFields];
    }
    else {
        [self updateToolbarWithButton:@"play"];
    }
}

- (void)displayBackgroundMediaFields
{
    MPMediaItemArtwork *artwork = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"Play"]];

    NSDictionary *info = @{ MPMediaItemPropertyTitle: self.textView.text,
                            MPMediaItemPropertyAlbumTitle: @"TextToSpeech App",
                            MPMediaItemPropertyArtwork: artwork};

    [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = info;


}

- (void)updateToolbar
{
    if (self.synthesizer.speaking && !self.synthesizer.paused) {
        [self updateToolbarWithButton:@"pause"];
    }
    else {
        [self updateToolbarWithButton:@"play"];
    }


}

- (void)updateToolbarWithButton:(NSString *)buttonType
{
    NSLog(@"updateToolbarWithButton: %@", buttonType);
    UIBarButtonItem *audioControl;
    if ([buttonType isEqualToString:@"play"]) {
        // Play
        audioControl = [[UIBarButtonItem alloc]initWithBarButtonSystemItem:UIBarButtonSystemItemPlay target:self action:@selector(handlePlayPauseButton:)];
    }
    else {
        // Pause
        audioControl = [[UIBarButtonItem alloc]initWithBarButtonSystemItem:UIBarButtonSystemItemPause target:self action:@selector(handlePlayPauseButton:)];


    }

    UIBarButtonItem *flexibleItem = [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemFlexibleSpace target:nil action:nil];

    [self.toolbar setItems:@[flexibleItem, audioControl, flexibleItem]];
}



- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent
{
    NSLog(@"receivedEvent: %@", receivedEvent);
    if (receivedEvent.type == UIEventTypeRemoteControl) {

        switch (receivedEvent.subtype) {

            case UIEventSubtypeRemoteControlPlay:
                NSLog(@"UIEventSubtypeRemoteControlPlay");
                if (self.synthesizer.speaking) {
                    [self.synthesizer continueSpeaking];
                }
                else {
                    [self speakUtterance];
                }
                break;

            case UIEventSubtypeRemoteControlPause:
                NSLog(@"pause - UIEventSubtypeRemoteControlPause");

                if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
                    // Pause immediately
                    [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
                }
                else {
                    // Pause at end of current word
                    [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
                }
                break;

            case UIEventSubtypeRemoteControlTogglePlayPause:
                if (self.synthesizer.paused) {
                    NSLog(@"UIEventSubtypeRemoteControlTogglePlayPause");
                    [self.synthesizer continueSpeaking];
                }
                else {
                    NSLog(@"UIEventSubtypeRemoteControlTogglePlayPause");
                    if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
                        // Pause immediately
                        [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
                    }
                    else {
                        // Pause at end of current word
                        [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
                    }
                }
                break;

            case UIEventSubtypeRemoteControlNextTrack:
                NSLog(@"UIEventSubtypeRemoteControlNextTrack - appropriate for playlists");
                break;

            case UIEventSubtypeRemoteControlPreviousTrack:
                NSLog(@"UIEventSubtypeRemoteControlPreviousTrack - appropriatefor playlists");
                break;

            default:
                break;
        }
    }
}

#pragma mark UIPickerViewDelegate Methods

- (NSInteger)numberOfComponentsInPickerView:(UIPickerView *)pickerView
{
    return 1;
}

- (NSInteger)pickerView:(UIPickerView *)pickerView numberOfRowsInComponent:(NSInteger)component
{
    return self.voices.count;
}

- (UIView *)pickerView:(UIPickerView *)pickerView viewForRow:(NSInteger)row forComponent:(NSInteger)component reusingView:(UIView *)view
{
    UILabel *rowLabel = [[UILabel alloc] init];
    NSDictionary *voice = [self.voices objectAtIndex:row];
    rowLabel.text = [voice objectForKey:@"label"];
    return rowLabel;
}

- (void)pickerView:(UIPickerView *)pickerView didSelectRow: (NSInteger)row inComponent:(NSInteger)component
{
    NSDictionary *voice = [self.voices objectAtIndex:row];
    NSLog(@"new picker voice selected with label: %@", [voice objectForKey:@"label"]);
    self.voice = [AVSpeechSynthesisVoice voiceWithLanguage:[voice objectForKey:@"voice"]];
}

#pragma mark SpeechSynthesizerDelegate methods

// In this method I call the speech recognition method after the voice has finished playing.

- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer didFinishSpeechUtterance:(AVSpeechUtterance *)utterance
{
    // This is a workaround for a bug: the first time the voice is changed, speaking the utterance fails silently. willSpeakRangeOfSpeechString: sets didStartSpeaking to YES; if it was never called (a silent failure), we simply request to speak again.
    if (!didStartSpeaking) {
        [self speakUtterance];

    }
    else {
        [self updateToolbarWithButton:@"play"];


        // Here I check whether speech recognition is running; if it is not, I restart it after the text-to-speech playback finishes.

        if (!audioEngine.isRunning) {
            double delayInSeconds = 1.5;
            dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC);
            dispatch_after(popTime, dispatch_get_main_queue(), ^(void) {
                [self start_record];
            });
        }
    }
}

- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer willSpeakRangeOfSpeechString:(NSRange)characterRange utterance:(AVSpeechUtterance *)utterance
{
    didStartSpeaking = YES;

}

#pragma mark UITextViewDelegate Methods

#pragma mark Cleanup Methods

- (void)dealloc
{
    [[NSNotificationCenter defaultCenter] removeObserver:self name:@"updateToolbar" object:nil];
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}
@end

In the code above, in the didFinishSpeechUtterance method, after the text-to-speech playback finishes I call start_record, which starts speech recognition again. So the "play" voice command ends up re-entering the start_record method, and there, at

         recognitionTask = [self.speechRecognizer recognitionTaskWithRequest:recognitionRequest resultHandler:^(SFSpeechRecognitionResult * result, NSError *  error1) {

it throws an error like [Utility] +[AFAggregator logDictationFailedWithError:] Error Domain=kAFAssistantErrorDomain Code=209 "(null)", and I don't know why.

Please help me fix this. Thanks in advance!

1 Answer:

Answer 0 (score: 1)

I was getting this 209 error, and occasionally a 203 as well. After trying many things I was finally able to resolve it. Here is what I did:

  1. Instead of calling [recogTask finish], I called [recogTask cancel].

  2. Then I waited in a while loop for recogTask to actually be cancelled before setting it to nil or taking any other step. For example (see the fuller sketch after this list):

    while (recogTask && recogTask.cancelled == NO) { [NSThread sleepForTimeInterval:0.1]; }

With that change I no longer get any 209 or 203 errors, and I stop recognition for 2 seconds every 20 seconds. I still need to see whether that 2-second gap can be reduced.
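Putting the two steps together, here is a minimal sketch of a stop routine under the same setup as the question (a sketch, not a drop-in fix): the ivar names recognitionTask, recognitionRequest, audioEngine and inputNode mirror the question's code, the 0.1 s polling interval follows the loop above, and the exact teardown order is my own assumption.

    // Hypothetical helper: stop the current recognition session cleanly
    // before starting a new one (e.g. before text-to-speech playback).
    - (void)stopRecognition
    {
        // Stop feeding audio first.
        [audioEngine stop];
        [inputNode removeTapOnBus:0];
        [recognitionRequest endAudio];

        // Cancel instead of finish, then poll until the task reports that
        // it really is cancelled before releasing the references.
        [recognitionTask cancel];
        while (recognitionTask && recognitionTask.isCancelled == NO) {
            [NSThread sleepForTimeInterval:0.1];
        }

        recognitionTask = nil;
        recognitionRequest = nil;
    }

Note that the polling loop blocks whichever thread runs it (the answer's loop above does the same), so in a real app you would call this off the main thread. Once it returns, start_record can safely create a fresh request and task.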

Thanks.