objective-c - Using speech recognition causes video playback to fail on iPhone

Date: 2018-05-01 02:47:59

Tags: objective-c avfoundation speech-recognition

In my app I have both video playback and speech recognition. If I don't use speech recognition, videos play fine, and I can see the AirPlay options "iPhone" and "Apple TV" (I have an Apple TV on the network).

But after I use speech recognition and exit it, the video won't play. It just sits on a black screen at 00:00. The AirPlay menu no longer shows "iPhone" and "Apple TV"; there is actually nothing to select, and it just says it is looking for "Apple TV". The title section of the AirPlay screen shows "iPhone -> IPHONE MICROPHONE" "Music".

In the console, when the video fails to play, I get the following message: "<<<< AVOutputDeviceDiscoverySession (FigRouteDiscoverer) >>>> -[AVFigRouteDiscovererOutputDeviceDiscoverySessionImpl outputDeviceDiscoverySessionDidChangeDiscoveryMode:]: Setting device discovery mode to DiscoveryMode_Presence (client: VideoPlayer)"

Nowhere in my code do I start AirPlay discovery or anything like it, yet after using the speech recognizer it somehow gets blocked.

Here is the code that starts and stops speech recognition:

- (void)startListening {

    // Initialize the AVAudioEngine
    _audioEngine = [[AVAudioEngine alloc] init];

    // Make sure there's not a recognition task already running
    if (_recognitionTask) {
        [_recognitionTask cancel];
        _recognitionTask = nil;
    }

    // Starts an AVAudio Session
    NSError *error;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryRecord error:&error];
    [audioSession setActive:YES withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation error:&error];

    // Starts a recognition process, in the block it logs the input or stops the audio
    // process if there's an error.
    _recognitionRequest = [[SFSpeechAudioBufferRecognitionRequest alloc] init];
    AVAudioInputNode *inputNode = _audioEngine.inputNode;
    _recognitionRequest.shouldReportPartialResults = YES;
    _recognitionTask = [_speechRecognizer recognitionTaskWithRequest:_recognitionRequest
                                                       resultHandler:^(SFSpeechRecognitionResult * _Nullable result, NSError * _Nullable error) {
        BOOL isFinal = NO;
        if (result) {
            // Whatever you say in the microphone after pressing the button should be being logged
            // in the console.
            NSLog(@"RESULT:%@",result.bestTranscription.formattedString);
            isFinal = result.isFinal;

            NSDictionary *dic = [ [NSDictionary alloc] initWithObjectsAndKeys:
                                 result.bestTranscription.formattedString, @"bestTranscription",
                                 result.transcriptions, @"transcriptions",
                                 nil];

            [[NSNotificationCenter defaultCenter] postNotificationName:@"monitorSpeech" object:nil userInfo:dic];

        }
        if (error) {
            NSLog(@"%s - Stopping Audio Engine and resetting recognitionRequest/Task. Error holds:%@", __PRETTY_FUNCTION__, error);
            [_audioEngine stop];
            [inputNode removeTapOnBus:0];
            _recognitionRequest = nil;
            _recognitionTask = nil;
        }
    }];

    // Sets the recording format
    AVAudioFormat *recordingFormat = [inputNode outputFormatForBus:0];
    [inputNode installTapOnBus:0 bufferSize:1024 format:recordingFormat block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
        [_recognitionRequest appendAudioPCMBuffer:buffer];
    }];

    // Starts the audio engine, i.e. it starts listening.
    [_audioEngine prepare];
    [_audioEngine startAndReturnError:&error];
    NSLog(@"Say Something, I'm listening");
}


-(void) stopListening {

    [_audioEngine stop];
    [_audioEngine.inputNode removeTapOnBus:0]; // remove the tap so the next installTapOnBus: doesn't throw
    [_audioEngine reset];
    [_recognitionRequest endAudio];
    [_recognitionTask cancel];
    //[_recognitionTask finish];
    _recognitionTask = nil;
    _recognitionRequest = nil;

}
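
A note on what I suspect: `startListening` switches the shared AVAudioSession to AVAudioSessionCategoryRecord, and `stopListening` never hands the session back to a playback category, which could be what leaves AVPlayer stuck and hides the AirPlay routes. A sketch of restoring the session after recognition stops (my own assumption, untested):

- (void)restorePlaybackSession {
    // Assumption: giving the audio session back to playback should make
    // AVPlayer and the AirPlay routes available again.
    NSError *error = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryPlayback error:&error];
    [audioSession setActive:YES error:&error];
    if (error) {
        NSLog(@"Failed to restore playback session: %@", error);
    }
}

The idea would be to call this at the end of stopListening, before handing control back to the player.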

For video playback I'm using AVPlayerViewController.
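
Alternatively, keeping playback routes available while listening might work with AVAudioSessionCategoryPlayAndRecord instead of the record-only category. A hedged sketch of that session setup (this is an assumption on my part; AVAudioSessionCategoryOptionAllowAirPlay requires iOS 10+ and is only valid with PlayAndRecord):

    // In startListening, instead of AVAudioSessionCategoryRecord:
    NSError *error = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord
                  withOptions:AVAudioSessionCategoryOptionAllowAirPlay |
                              AVAudioSessionCategoryOptionDefaultToSpeaker
                        error:&error];
    [audioSession setActive:YES error:&error];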

Please advise how to solve this. Thanks!

0 Answers:

There are no answers.