I want to mix two audio files into one so I can synchronise them with HTML5 on the client. I have seen that the Web Audio API can do a lot, but I still cannot figure this out.
I have links to two audio files (.mp3, .wav ...), and what I want is to synchronise the two, for example a voice track over a song. I don't want them played one after the other; I want them synchronised.
I would like to do everything on the client with HTML5, without needing a server. Is this possible?
Thank you very much for your help.
Answer 0 (score: 0)
So, as I understand it, you have two audio files that you want to render together on the client. The Web Audio API can do this for you quite easily in JavaScript. A good place to start is http://www.html5rocks.com/en/tutorials/webaudio/intro/
An example script would be:
var context = new (window.AudioContext || window.webkitAudioContext)(); // Create an audio context
// Create an XML HTTP Request to collect your audio files
// https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest
var xhr1 = new XMLHttpRequest();
var xhr2 = new XMLHttpRequest();
var audio_buffer_1, audio_buffer_2;
xhr1.open("GET","your_url_to_audio_1");
xhr1.responseType = 'arraybuffer';
xhr1.onload = function() {
  // Decode the audio data
  context.decodeAudioData(xhr1.response, function(buffer) {
    audio_buffer_1 = buffer;
  }, function(error) {});
};
xhr2.open("GET","your_url_to_audio_2");
xhr2.responseType = 'arraybuffer';
xhr2.onload = function() {
  // Decode the audio data
  context.decodeAudioData(xhr2.response, function(buffer) {
    audio_buffer_2 = buffer;
  }, function(error) {});
};
xhr1.send();
xhr2.send();
These requests load the two files as Web Audio API AudioBuffer objects (https://webaudio.github.io/web-audio-api/#AudioBuffer) into the global variables audio_buffer_1 and audio_buffer_2.
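The two XHRs above can also be wrapped in a small promise-based helper, so you can wait for both decodes to finish before mixing. This is only a sketch: the name `loadBuffer` is hypothetical, and it assumes a modern browser where `fetch` is available and `decodeAudioData` returns a promise.

```javascript
// Hypothetical helper: fetch a URL and decode it into an AudioBuffer.
// Assumes fetch() and the promise form of decodeAudioData() (modern browsers).
function loadBuffer(context, url) {
  return fetch(url)
    .then(function (response) {
      if (!response.ok) throw new Error("HTTP " + response.status + " for " + url);
      return response.arrayBuffer();
    })
    .then(function (data) {
      return context.decodeAudioData(data);
    });
}

// Usage (in the browser):
//   Promise.all([
//     loadBuffer(context, "your_url_to_audio_1"),
//     loadBuffer(context, "your_url_to_audio_2")
//   ]).then(function (buffers) {
//     audio_buffer_1 = buffers[0];
//     audio_buffer_2 = buffers[1];
//   });
```

With `Promise.all` you avoid the need to poll the two global variables before starting the offline render.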
Now, to create a new audio buffer containing the mix, you need to use an offline audio context:
// Assumes both buffers are the same length; if not, use the longer duration in the 2nd argument below
var offlineContext = new OfflineAudioContext(context.destination.channelCount, Math.ceil(audio_buffer_1.duration * context.sampleRate), context.sampleRate);
var summing = offlineContext.createGain();
summing.connect(offlineContext.destination);
// Build the two buffer source nodes, attach their buffers, and connect them to the summing gain
var buffer_1 = offlineContext.createBufferSource();
var buffer_2 = offlineContext.createBufferSource();
buffer_1.buffer = audio_buffer_1;
buffer_2.buffer = audio_buffer_2;
buffer_1.connect(summing);
buffer_2.connect(summing);
// Do something with the result by adding a callback
offlineContext.oncomplete = function(event) {
  var renderedBuffer = event.renderedBuffer;
  // Place code here
};
// Begin the summing
buffer_1.start(0);
buffer_2.start(0);
offlineContext.startRendering();
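The comment above assumes both buffers are the same length. If they are not, the offline context should be sized for the longer of the two; a small helper (the name `renderLength` is mine, not part of the API) can compute the frame count:

```javascript
// Sketch: frame count for an OfflineAudioContext that must fit both
// buffers, whichever is longer. Durations are in seconds.
function renderLength(duration1, duration2, sampleRate) {
  return Math.ceil(Math.max(duration1, duration2) * sampleRate);
}

// Usage:
//   var offlineContext = new OfflineAudioContext(
//     context.destination.channelCount,
//     renderLength(audio_buffer_1.duration, audio_buffer_2.duration, context.sampleRate),
//     context.sampleRate
//   );
```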
Once rendering completes, the callback receives an event whose renderedBuffer property is a new AudioBuffer that is the direct sum of the two sources.
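One thing you could do with that result is play it back through the live audio context, so both tracks are heard in sync. A sketch (the function name `playMixed` is hypothetical; it assumes the `context` variable created in the loading code above):

```javascript
// Sketch: play a rendered AudioBuffer through a live AudioContext.
function playMixed(context, renderedBuffer) {
  var player = context.createBufferSource();
  player.buffer = renderedBuffer;      // the summed AudioBuffer
  player.connect(context.destination); // route to the speakers
  player.start(0);                     // both tracks start together, in sync
  return player;
}

// Usage, inside the offline context's callback:
//   offlineContext.oncomplete = function (event) {
//     playMixed(context, event.renderedBuffer);
//   };
```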