Using AVAudioRecorder

Date: 2016-06-21 21:07:51

Tags: swift ios10

With iOS 10, Apple released a new framework that recognizes speech. Data can be passed to it either by appending AVAudioPCMBuffers or by providing the URL of an m4a file. At the moment my speech recognition works from the URL of the recorded file, which is only possible after someone has finished speaking, so it is not real time. Here is the code:

import AVFoundation
import Speech

let audioSession = AVAudioSession.sharedInstance()
var audioRecorder: AVAudioRecorder!
var soundURLGlobal: URL!

func setUp() {
    // Record mono AAC at 44.1 kHz, medium quality.
    let recordSettings = [AVSampleRateKey : NSNumber(value: Float(44100.0)),
                          AVFormatIDKey : NSNumber(value: Int32(kAudioFormatMPEG4AAC)),
                          AVNumberOfChannelsKey : NSNumber(value: 1),
                          AVEncoderAudioQualityKey : NSNumber(value: Int32(AVAudioQuality.medium.rawValue))]

    // Write the recording to sound.m4a in the app's Documents directory.
    let fileManager = FileManager.default
    let urls = fileManager.urls(for: .documentDirectory, in: .userDomainMask)
    let documentDirectory = urls[0]
    let soundURL = documentDirectory.appendingPathComponent("sound.m4a")
    soundURLGlobal = soundURL

    do {
        try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
        try audioRecorder = AVAudioRecorder(url: soundURL, settings: recordSettings)
        audioRecorder.prepareToRecord()
    } catch {
        print("setUp failed: \(error)")
    }
}

func start() {
    do {
        try audioSession.setActive(true)
        audioRecorder.record()
    } catch {
        print("start failed: \(error)")
    }
}

func stop() {
    audioRecorder.stop()

    // Recognize the finished recording from its file URL.
    let request = SFSpeechURLRecognitionRequest(url: soundURLGlobal)
    let recognizer = SFSpeechRecognizer()
    recognizer?.recognitionTask(with: request, resultHandler: { (result, error) in
        if let result = result, result.isFinal {
            print(result.bestTranscription.formattedString)
        }
    })
}

I am trying to convert this to the buffer approach, but I cannot find where to get an AVAudioPCMBuffer from.

Thanks,

1 Answer:

Answer 0 (score: 0)

Good topic.

Hi B Person,

Here is the thread with the solution: Tap Mic Input Using AVAudioEngine in Swift

See the WWDC 2014 session 502 - AVAudioEngine in Practice: capturing the microphone => at 20 min, creating a buffer with tap code => at 21:50.

Here is the Swift 3 code:

@IBAction func button01Pressed(_ sender: Any) {

    // Install a tap on the microphone input; each 2048-frame
    // AVAudioPCMBuffer is delivered to the closure below.
    let inputNode = audioEngine.inputNode
    let bus = 0
    inputNode?.installTap(onBus: bus, bufferSize: 2048, format: inputNode?.inputFormat(forBus: bus)) {
        (buffer: AVAudioPCMBuffer, time: AVAudioTime) -> Void in

        let theLength = Int(buffer.frameLength)
        print("theLength = \(theLength)")

        // Copy the first channel's samples out of the buffer as Doubles.
        var samplesAsDoubles: [Double] = []
        for i in 0 ..< Int(buffer.frameLength) {
            let theSample = Double((buffer.floatChannelData?.pointee[i])!)
            samplesAsDoubles.append(theSample)
        }

        print("samplesAsDoubles.count = \(samplesAsDoubles.count)")
    }

    audioEngine.prepare()
    try! audioEngine.start()
}
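
To connect this tap to the Speech framework, the buffers it delivers can be appended to an SFSpeechAudioBufferRecognitionRequest for live recognition. The following is only a minimal sketch, not the answer author's code: it assumes Speech authorization has already been granted, and the names startLiveTranscription, recognitionRequest and recognitionTask are hypothetical members on the same class that owns audioEngine.

import Speech

// Hypothetical stored properties assumed on the same class:
// var recognitionRequest: SFSpeechAudioBufferRecognitionRequest?
// var recognitionTask: SFSpeechRecognitionTask?

func startLiveTranscription() {
    let request = SFSpeechAudioBufferRecognitionRequest()
    request.shouldReportPartialResults = true
    recognitionRequest = request

    // Feed every tapped buffer straight into the recognition request.
    let inputNode = audioEngine.inputNode
    let bus = 0
    inputNode?.installTap(onBus: bus, bufferSize: 2048, format: inputNode?.inputFormat(forBus: bus)) {
        (buffer: AVAudioPCMBuffer, time: AVAudioTime) in
        request.append(buffer)
    }

    // In a real app the recognizer would normally be kept as a stored property.
    let recognizer = SFSpeechRecognizer()
    recognitionTask = recognizer?.recognitionTask(with: request) { (result, error) in
        if let result = result {
            // Partial and final transcriptions arrive here while audio streams in.
            print(result.bestTranscription.formattedString)
        }
    }

    audioEngine.prepare()
    try? audioEngine.start()
}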

Stopping the audio:

func stopAudio() {
    let inputNode = audioEngine.inputNode
    let bus = 0
    inputNode?.removeTap(onBus: bus)
    self.audioEngine.stop()
}
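
If the buffers are being fed to a recognition request as sketched above, the request should also be told that no more audio is coming so the recognizer can deliver a final result. A hedged sketch of the corresponding teardown, again using the hypothetical recognitionRequest and recognitionTask properties:

func stopLiveTranscription() {
    audioEngine.inputNode?.removeTap(onBus: 0)
    audioEngine.stop()

    // Signal end of audio so the recognizer can finish and return a final transcription.
    recognitionRequest?.endAudio()
    recognitionTask = nil
}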