How do I play an arrayBuffer as an audio file?

Time: 2018-05-24 18:30:37

Tags: javascript web-audio web-audio-api

I receive an arrayBuffer via a socket.io event and would like to be able to process the stream and play it as an audio file.

I am receiving the buffer like this:

retrieveAudioStream = () => {
  this.socket.on('stream', (arrayBuffer) => {
    console.log('arrayBuffer', arrayBuffer)
  })
}

Is it possible to set the src attribute of an <audio/> element to the buffer? If not, how can I play the incoming buffer stream?
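
For context: the src of an <audio/> element must be a URL, so a buffer cannot be assigned to it directly. If the buffer held a complete encoded file (e.g. WAV or MP3) rather than raw PCM chunks, one could wrap it in a Blob and point the element at an object URL. A minimal sketch, where the 'file' event name and the 'audio/wav' MIME type are assumptions:

const audio = document.querySelector('audio');

this.socket.on('file', (arrayBuffer) => {
  // Only valid when arrayBuffer holds a complete encoded audio file,
  // not the raw PCM chunks streamed in this question.
  const blob = new Blob([arrayBuffer], { type: 'audio/wav' });
  audio.src = URL.createObjectURL(blob);
  audio.play();
});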

Edit:

To show how I capture the audio input and stream it:

window.navigator.getUserMedia(constraints, this.initializeRecorder, this.handleError);



initializeRecorder = (stream) => {
    const audioContext = window.AudioContext;
    const context = new audioContext();
    const audioInput = context.createMediaStreamSource(stream);
    const bufferSize = 2048;
    // create a javascript node
    const recorder = context.createScriptProcessor(bufferSize, 1, 1);
    // specify the processing function
    recorder.onaudioprocess = this.recorderProcess;
    // connect stream to our recorder
    audioInput.connect(recorder);
    // connect our recorder to the previous destination
    recorder.connect(context.destination);
  }

This is where I receive the inputBuffer event and emit the stream over socket.io:

  recorderProcess = (e) => {
    const left = e.inputBuffer.getChannelData(0);
    this.socket.emit('stream', this.convertFloat32ToInt16(left))
  }
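
The convertFloat32ToInt16 helper is not shown in the question; a typical implementation (a hypothetical sketch, not the asker's actual code) clamps each float32 sample to [-1, 1] and scales it to the signed 16-bit range:

convertFloat32ToInt16 = (float32Array) => {
  // Hypothetical helper: scale float32 samples to int16.
  const int16Array = new Int16Array(float32Array.length);
  for (let i = 0; i < float32Array.length; i++) {
    const s = Math.max(-1, Math.min(1, float32Array[i]));
    int16Array[i] = s < 0 ? s * 0x8000 : s * 0x7FFF;
  }
  return int16Array.buffer; // socket.io transmits the underlying ArrayBuffer
}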

Edit 2: Adding Raymond's suggestion:

retrieveAudioStream = () => {
  const audioContext = new window.AudioContext();

  this.socket.on('stream', (buffer) => {
    const b = audioContext.createBuffer(1, buffer.length, audioContext.sampleRate);
    b.copyToChannel(buffer, 0, 0);
    const s = audioContext.createBufferSource();
    s.buffer = b;
  });
}

Getting the error: NotSupportedError: Failed to execute 'createBuffer' on 'BaseAudioContext': The number of frames provided (0) is less than or equal to the minimum bound (0).

1 Answer:

Answer 0 (score: 0)

Based on a quick read of initializeRecorder and recorderProcess, it looks like you are converting the float32 samples to int16 at some point and sending them on to retrieveAudioStream that way.

If that is correct, then arrayBuffer is an array of int16 values. Convert them back to float32 (most likely by dividing each value by 32768) and save them in a Float32Array. Then create an AudioBuffer of the same length and call copyToChannel(float32Array, 0, 0) to write the values into it. Use an AudioBufferSourceNode with this buffer to play the audio.
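
Putting those steps together: the NotSupportedError from Edit 2 most likely occurs because buffer arrives as a raw ArrayBuffer, which has byteLength but no length property, so createBuffer is asked for 0 frames. Below is a minimal sketch of the suggested approach, assuming each socket.io payload is an ArrayBuffer of int16 samples (each chunk is played as soon as it arrives, so gaps between chunks are not smoothed over):

retrieveAudioStream = () => {
  const audioContext = new window.AudioContext();

  this.socket.on('stream', (arrayBuffer) => {
    // View the raw bytes as int16 samples to get a usable length.
    const int16Array = new Int16Array(arrayBuffer);

    // Convert back to float32 in [-1, 1) by dividing by 32768.
    const float32Array = new Float32Array(int16Array.length);
    for (let i = 0; i < int16Array.length; i++) {
      float32Array[i] = int16Array[i] / 32768;
    }

    // Create an AudioBuffer of the same length and write the samples into it.
    const audioBuffer = audioContext.createBuffer(1, float32Array.length, audioContext.sampleRate);
    audioBuffer.copyToChannel(float32Array, 0, 0);

    // Play the buffer through an AudioBufferSourceNode.
    const source = audioContext.createBufferSource();
    source.buffer = audioBuffer;
    source.connect(audioContext.destination);
    source.start();
  });
}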