HTML5 audio tag on Safari has a delay

Time: 2012-03-21 19:20:37

Tags: javascript html5 html5-audio

I'm trying to build a simple doodle-like behaviour where an mp3/ogg sound plays on click, using the HTML audio tag. It should work under Firefox, Safari and Safari on the iPad.

I have tried a lot of approaches and have come down to this:

HTML

    <span id="play-blue-note" class="play blue" ></span>
    <span id="play-green-note" class="play green" ></span>


    <audio id="blue-note" style="display:none" controls preload="auto" autobuffer> 
        <source src="blue.mp3" />
        <source src="blue.ogg" />
        <!-- now include flash fall back -->
    </audio>

    <audio id="green-note" style="display:none" controls preload="auto" autobuffer> 
        <source src="green.mp3" />
        <source src="green.ogg" />
    </audio>

JS

function addSource(elem, path) {
    $('<source>').attr('src', path).appendTo(elem);
}

$(document).ready(function() {


    $('body').delegate('.play', 'click touchstart', function() {
        var clicked = $(this).attr('id').split('-')[1];

        $('#' + clicked + '-note').get(0).play();
    });

});  

You can see the whole demo at ign.com.uy/loog/

This seems to work fine under Firefox, but Safari seems to have a delay whenever you click, even if you click repeatedly and the audio file has already loaded. On Safari on the iPad it behaves almost unpredictably.

Also, Safari's performance seems to improve when I test locally, so I'm guessing Safari is downloading the file every time. Could that be it? How can I avoid it? Thanks!

10 Answers:

Answer 0 (score: 8)

I just answered another iOS/<audio> question a few minutes ago. It seems to apply here as well:

Preloading of <audio> and <video> is disabled on iOS devices, to save bandwidth.


In Safari on iOS (for all devices, including iPad), where the user may be on a cellular network and be charged per data unit, preload and autoplay are disabled. No data is loaded until the user initiates it.

Source: Safari Developer Library
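
In practice the way to live with this restriction is to trigger load() (or the first play()) from a user gesture; after that, subsequent play() calls respond normally. Here is a minimal sketch of that idea, assuming the blue-note element from the question (the listener and priming logic are only illustrative):

    // Hypothetical: prime the audio element on the first user tap, since iOS only
    // honours load()/play() when they run inside a user-initiated event handler.
    var note = document.getElementById('blue-note');
    var primed = false;

    document.body.addEventListener('touchstart', function primeAudio() {
        if (primed) return;
        primed = true;
        note.load(); // allowed here because it runs inside a user gesture
        document.body.removeEventListener('touchstart', primeAudio);
    }, false);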

Answer 1 (score: 4)

The problem with Safari is that it makes a request for the audio file every time it is played. You can try creating an HTML5 cache manifest. Unfortunately, in my experience you can only add one audio file to the cache at a time. A workaround might be to merge all of your audio files sequentially into a single file and start playback at a specific position depending on the sound you need. You can then create an interval that tracks the current playback position and pauses it once it reaches a certain timestamp.
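
A rough sketch of that sprite-style workaround, assuming a single combined file and hand-measured offsets (the element id and timings below are made up for illustration):

    // Hypothetical audio-sprite playback: one combined file, per-sound offsets.
    var sprite = document.getElementById('all-notes'); // e.g. <audio id="all-notes" src="all-notes.mp3">
    var sounds = {
        blue:  { start: 0.0, end: 1.5 },  // made-up offsets in seconds
        green: { start: 1.5, end: 3.0 }
    };

    function playSprite(name) {
        var s = sounds[name];
        sprite.currentTime = s.start;
        sprite.play();
        var watcher = setInterval(function() {
            if (sprite.currentTime >= s.end) {
                sprite.pause();
                clearInterval(watcher);
            }
        }, 10);
    }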

Read more about creating an HTML5 cache manifest here:

http://www.html5rocks.com/en/tutorials/appcache/beginner/

http://www.whatwg.org/specs/web-apps/current-work/multipage/offline.html

Hope it helps!

Answer 2 (score: 3)

Apple decided (to save on cellular data costs) not to preload <audio> / <video> HTML elements.

From the Safari Developer Library:


In Safari on iOS (for all devices, including iPad), where the user may be on a cellular network and be charged per data unit, preload and autoplay are disabled. No data is loaded until the user initiates it. This means the JavaScript play() and load() methods are also inactive until the user initiates playback, unless the play() or load() method is triggered by user action. In other words, a user-initiated Play button works, but an onLoad="play()" event does not.


This plays the movie: <input type="button" value="Play" onClick="document.myMovie.play()">


This does not work on iOS: <body onLoad="document.myMovie.play()">


I don't think you can get around this restriction, but you might.

Remember: Google is your best friend.


UPDATE: After some experimenting, I found a way to play <audio> with JavaScript:

var vid = document.createElement("iframe");
vid.setAttribute('src', "http://yoursite.com/yourvideooraudio.mp4"); // replace with actual source
vid.setAttribute('width', '1px');
vid.setAttribute('height', '1px');
vid.setAttribute('scrolling', 'no');
vid.style.border = "0px";
document.body.appendChild(vid);

Note: I have only tried it with <audio>.


UPDATE 2: jsFiddle here. It seems to work.

Answer 3 (score: 1)

On desktop Safari, adding an AudioContext fixes the problem:

const AudioContext = window.AudioContext || window.webkitAudioContext;
const audioCtx = new AudioContext();

I found it by accident, so I have no idea why it works, but it removed the delay in my app.
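
If you try this yourself, note that Safari may also leave the context in the "suspended" state until a user gesture occurs; a hedged companion step (not part of the original answer) is to resume it on the first interaction:

    // Hypothetical companion step: resume the AudioContext on the first click,
    // because Safari may keep it "suspended" until a user gesture happens.
    const Ctx = window.AudioContext || window.webkitAudioContext;
    const audioCtx = new Ctx();

    document.addEventListener('click', function resumeOnce() {
        if (audioCtx.state === 'suspended') {
            audioCtx.resume();
        }
        document.removeEventListener('click', resumeOnce);
    });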

Answer 4 (score: 0)

Your audio files are loaded once and then cached. Playing the sounds repeatedly, even after a page refresh, did not trigger any further HTTP requests in Safari for me.

I just had a look at one of your sounds in an audio editor - there is a small chunk of silence at the beginning of the file. That will manifest itself as latency.

Is the Web Audio API a viable option for you?
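
If the leading silence turns out to be the culprit, one cheap workaround (my own suggestion, not part of this answer) is to start playback slightly past the beginning of the file; the 50 ms offset below is a guess you would tune per file:

    // Hypothetical tweak to the question's handler: skip an estimated bit of leading silence.
    var LEADING_SILENCE = 0.05; // seconds - measure the real value in an audio editor

    $('body').delegate('.play', 'click touchstart', function() {
        var clicked = $(this).attr('id').split('-')[1];
        var audio = $('#' + clicked + '-note').get(0);
        audio.currentTime = LEADING_SILENCE; // jump past the silent intro
        audio.play();
    });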

Answer 5 (score: 0)

I am facing the same problem. The strange thing is that I am preloading the files. On WiFi it plays fine, but on mobile data there is a long delay before it starts. I thought it was related to loading speed, but I don't start playing my scene until all the images and audio files have loaded. Any suggestions would be great. (I know this is not an answer, but I thought it was better than making a duplicate post.)

Answer 6 (score: 0)

HTML5 audio delay on Safari iOS (<audio> element vs. AudioContext)

Yes, Safari on iOS adds a delay to the audio when you use the native <audio> element... but it can be overcome by using AudioContext.

My code snippet is based on what I learned from https://lowlag.alienbill.com/

Please test the functionality on your own iOS device (I tested on iOS 12): https://fiddle.jshell.net/eLya8fxb/51/show/

Snippet from JS Fiddle: https://jsfiddle.net/eLya8fxb/51/

// Requires jQuery 

// Adding:
// Strip down lowLag.js so it only supports audioContext (So no IE11 support (only Edge))
// Add "loop" monkey patch needed for looping audio (my primary usage)
// Add single audio channel - to avoid overlapping audio playback

// Original source: https://lowlag.alienbill.com/lowLag.js

if (!window.console) console = {
  log: function() {}
};

var lowLag = new function() {
  this.someVariable = undefined;
  this.showNeedInit = function() {
    lowLag.msg("lowLag: you must call lowLag.init() first!");
  }
  this.load = this.showNeedInit;
  this.play = this.showNeedInit;
  this.pause = this.showNeedInit;
  this.stop = this.showNeedInit;
  this.switch = this.showNeedInit;
  this.change = this.showNeedInit;
  
  this.audioContext = undefined;
  this.audioContextPendingRequest = {};
  this.audioBuffers = {};
  this.audioBufferSources = {};
  this.currentTag = undefined;
  this.currentPlayingTag = undefined;

  this.init = function() {
    this.msg("init audioContext");
    this.load = this.loadSoundAudioContext;
    this.play = this.playSoundAudioContext;
    this.pause = this.pauseSoundAudioContext;
    this.stop = this.stopSoundAudioContext;
    this.switch = this.switchSoundAudioContext;
    this.change = this.changeSoundAudioContext;

    if (!this.audioContext) {
      this.audioContext = new(window.AudioContext || window.webkitAudioContext)();
    }
  }

  //we'll use the tag they hand us, or else the url as the tag if it's a single tag,
  //or the first url 
  this.getTagFromURL = function(url, tag) {
    if (tag != undefined) return tag;
    return lowLag.getSingleURL(url);
  }
  this.getSingleURL = function(urls) {
    if (typeof(urls) == "string") return urls;
    return urls[0];
  }
  //coerce to be an array
  this.getURLArray = function(urls) {
    if (typeof(urls) == "string") return [urls];
    return urls;
  }

  this.loadSoundAudioContext = function(urls, tag) {
    var url = lowLag.getSingleURL(urls);
    tag = lowLag.getTagFromURL(urls, tag);
    lowLag.msg('webkit/chrome audio loading ' + url + ' as tag ' + tag);
    var request = new XMLHttpRequest();
    request.open('GET', url, true);
    request.responseType = 'arraybuffer';

    // Decode asynchronously
    request.onload = function() {
      // if you want "successLoadAudioFile" to only be called one time, you could try just using Promises (the newer return value for decodeAudioData)
      // Ref: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/decodeAudioData

      //Older callback syntax:
      //baseAudioContext.decodeAudioData(ArrayBuffer, successCallback, errorCallback);
      //Newer promise-based syntax:
      //Promise<decodedData> baseAudioContext.decodeAudioData(ArrayBuffer);


      // ... however you might want to use a pollfil for browsers that support Promises, but does not yet support decodeAudioData returning a Promise.
      // Ref: https://github.com/mohayonao/promise-decode-audio-data
      // Ref: https://caniuse.com/#search=Promise

      // var retVal = lowLag.audioContext.decodeAudioData(request.response);

      // Note: "successLoadAudioFile" is called twice. Once for legacy syntax (success callback), and once for newer syntax (Promise)
      var retVal = lowLag.audioContext.decodeAudioData(request.response, successLoadAudioFile, errorLoadAudioFile);
      //Newer versions of audioContext return a promise, which could throw a DOMException
      if (retVal && typeof retVal.then == 'function') {
        retVal.then(successLoadAudioFile).catch(function(e) {
          errorLoadAudioFile(e);
          urls.shift(); //remove the first url from the array
          if (urls.length > 0) {
            lowLag.loadSoundAudioContext(urls, tag); //try the next url
          }
        });
      }
    };

    request.send();

    function successLoadAudioFile(buffer) {
      lowLag.audioBuffers[tag] = buffer;
      if (lowLag.audioContextPendingRequest[tag]) { //a request might have come in, try playing it now
        lowLag.playSoundAudioContext(tag);
      }
    }

    function errorLoadAudioFile(e) {
      lowLag.msg("Error loading webkit/chrome audio: " + e);
    }
  }

  this.playSoundAudioContext = function(tag) {
    var context = lowLag.audioContext;

    // if some audio is currently active and hasn't been switched, or you are explicitly asking to play audio that is already active... then see if it needs to be unpaused
    // ... if you've switch audio, or are explicitly asking to play new audio (that is not the currently active audio) then skip trying to unpause the audio
    if ((lowLag.currentPlayingTag && lowLag.currentTag && lowLag.currentPlayingTag === lowLag.currentTag) || (tag && lowLag.currentPlayingTag && lowLag.currentPlayingTag === tag)) {
      // find currently paused audio (suspended) and unpause it (resume)
      if (context !== undefined) {
        // ref: https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/suspend
        if (context.state === 'suspended') {
          context.resume().then(function() {
            lowLag.msg("playSoundAudioContext resume " + lowLag.currentPlayingTag);
            return;
          }).catch(function(e) {
            lowLag.msg("playSoundAudioContext resume error for " + lowLag.currentPlayingTag + ". Error: " + e);
          });
          return;
        }
      }
    }
    
    if (tag === undefined) {
      tag = lowLag.currentTag;
    }

    if (lowLag.currentPlayingTag && lowLag.currentPlayingTag === tag) {
      // ignore request to play same sound a second time - it's already playing
      lowLag.msg("playSoundAudioContext already playing " + tag);
      return;
    } else {
      lowLag.msg("playSoundAudioContext " + tag);
    }

    var buffer = lowLag.audioBuffers[tag];
    if (buffer === undefined) { //possibly not loaded; put in a request to play onload
      lowLag.audioContextPendingRequest[tag] = true;
      lowLag.msg("playSoundAudioContext pending request " + tag);
      return;
    }

    // need to create a new AudioBufferSourceNode every time... 
    // you can't call start() on an AudioBufferSourceNode more than once. They're one-time-use only.
    var source;
    source = context.createBufferSource(); // creates a sound source
    source.buffer = buffer; // tell the source which sound to play
    source.connect(context.destination); // connect the source to the context's destination (the speakers)
    source.loop = true;
    lowLag.audioBufferSources[tag] = source;

    // find current playing audio and stop it
    var sourceOld = lowLag.currentPlayingTag ? lowLag.audioBufferSources[lowLag.currentPlayingTag] : undefined;
    if (sourceOld !== undefined) {
      if (typeof(sourceOld.noteOff) == "function") {
        sourceOld.noteOff(0);
      } else {
        sourceOld.stop();
      }
      lowLag.msg("playSoundAudioContext stopped " + lowLag.currentPlayingTag);
      lowLag.audioBufferSources[lowLag.currentPlayingTag] = undefined;
      lowLag.currentPlayingTag = undefined;
    }

    // play the new source audio
    if (typeof(source.noteOn) == "function") {
      source.noteOn(0);
    } else {
      source.start();
    }
    lowLag.currentTag = tag;
    lowLag.currentPlayingTag = tag;
    
    if (context.state === 'running') {
      lowLag.msg("playSoundAudioContext started " + tag);
    } else if (context.state === 'suspended') {
      /// if the audio context is in a suspended state then unpause (resume)
      context.resume().then(function() {
        lowLag.msg("playSoundAudioContext started and then resumed " + tag);
      }).catch(function(e) {
        lowLag.msg("playSoundAudioContext started and then had a resuming error for " + tag + ". Error: " + e);
      });
    } else if (context.state === 'closed') {
      // ignore request to pause sound - it's already closed
      lowLag.msg("playSoundAudioContext failed to start, context closed for " + tag);
    } else {
      lowLag.msg("playSoundAudioContext unknown AudioContext.state for " + tag + ". State: " + context.state);
    }
  }

  this.pauseSoundAudioContext = function() {
    // not passing in a "tag" parameter because we are playing all audio in one channel
    var tag = lowLag.currentPlayingTag;
    var context = lowLag.audioContext;

    if (tag === undefined) {
      // ignore request to pause sound as nothing is currently playing
      lowLag.msg("pauseSoundAudioContext nothing to pause");
      return;
    }

    // find currently playing (running) audio and pause it (suspend)
    if (context !== undefined) {
      // ref: https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/suspend
      if (context.state === 'running') {
      	lowLag.msg("pauseSoundAudioContext " + tag);
        context.suspend().then(function() {
          lowLag.msg("pauseSoundAudioContext suspended " + tag);
        }).catch(function(e) {
          lowLag.msg("pauseSoundAudioContext suspend error for " + tag + ". Error: " + e);
        });
      } else if (context.state === 'suspended') {
        // ignore request to pause sound - it's already suspended
        lowLag.msg("pauseSoundAudioContext already suspended " + tag);
      } else if (context.state === 'closed') {
        // ignore request to pause sound - it's already closed
        lowLag.msg("pauseSoundAudioContext already closed " + tag);
      } else {
        lowLag.msg("pauseSoundAudioContext unknown AudioContext.state for " + tag + ". State: " + context.state);
      }
    }
  }

  this.stopSoundAudioContext = function() {
    // not passing in a "tag" parameter because we are playing all audio in one channel
    var tag = lowLag.currentPlayingTag;

    if (tag === undefined) {
      // ignore request to stop sound as nothing is currently playing
      lowLag.msg("stopSoundAudioContext nothing to stop");
      return;
    } else {
      lowLag.msg("stopSoundAudioContext " + tag);
    }

    // find current playing audio and stop it
    var source = lowLag.audioBufferSources[tag];
    if (source !== undefined) {
      if (typeof(source.noteOff) == "function") {
        source.noteOff(0);
      } else {
        source.stop();
      }
      lowLag.msg("stopSoundAudioContext stopped " + tag);
      lowLag.audioBufferSources[tag] = undefined;
      lowLag.currentPlayingTag = undefined;
    }
  }

  this.switchSoundAudioContext = function(autoplay) {
    lowLag.msg("switchSoundAudioContext " + (autoplay ? 'and autoplay' : 'and do not autoplay'));

    if (lowLag.currentTag && lowLag.currentTag == 'audio1') {
      lowLag.currentTag = 'audio2';
    } else {
      lowLag.currentTag = 'audio1';
    }

    if (autoplay) {
      lowLag.playSoundAudioContext();
    }
  }

  this.changeSoundAudioContext = function(tag, autoplay) {
    lowLag.msg("changeSoundAudioContext to tag " + tag + " " + (autoplay ? 'and autoplay' : 'and do not autoplay'));

    if (tag === undefined) {
      lowLag.msg("changeSoundAudioContext tag is undefined");
      return;
    }
    
    lowLag.currentTag = tag;

    if (autoplay) {
      lowLag.playSoundAudioContext();
    }
  }

  this.msg = function(m) {
    m = "-- lowLag " + m;
    console.log(m);
  }
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.0/jquery.min.js"></script>
<script>
  // AudioContext
  $(document).ready(function() {
    lowLag.init();
    lowLag.load(['https://coubsecure-s.akamaihd.net/get/b86/p/coub/simple/cw_looped_audio/f0dab49f867/083bf409a75db824122cf/med_1550250381_med.mp3'], 'audio1');
    lowLag.load(['https://coubsecure-s.akamaihd.net/get/b173/p/coub/simple/cw_looped_audio/0d5adfff2ee/80432a356484068bb0e15/med_1550254045_med.mp3'], 'audio2');
    // starts with audio1
    lowLag.changeSoundAudioContext('audio1', false);
  });

  // ----------------

  // Audio Element
  $(document).ready(function() {
    var $audioElement = $('#audioElement');
    var audioEl = $audioElement[0];
    var audioSources = {
      "audio1": "https://coubsecure-s.akamaihd.net/get/b86/p/coub/simple/cw_looped_audio/f0dab49f867/083bf409a75db824122cf/med_1550250381_med.mp3",
      "audio2": "https://coubsecure-s.akamaihd.net/get/b173/p/coub/simple/cw_looped_audio/0d5adfff2ee/80432a356484068bb0e15/med_1550254045_med.mp3"
    };
    playAudioElement = function() {
      audioEl.play();
    }
    pauseAudioElement = function() {
      audioEl.pause();
    }
    stopAudioElement = function() {
      audioEl.pause();
      audioEl.currentTime = 0;
    }
    switchAudioElement = function(autoplay) {
      var source = $audioElement.attr('data-source');

      if (source && source == 'audio1') {
        $audioElement.attr('src', audioSources.audio2);
        $audioElement.attr('data-source', 'audio2');
      } else {
        $audioElement.attr('src', audioSources.audio1);
        $audioElement.attr('data-source', 'audio1');
      }

      if (autoplay) {
        audioEl.play();
      }
    }
    changeAudioElement = function(tag, autoplay) {
      var source = $audioElement.attr('data-source');
      
      if(tag === undefined || audioSources[tag] === undefined) {
      	return;
      }

      $audioElement.attr('src', audioSources[tag]);
      $audioElement.attr('data-source', tag);

      if (autoplay) {
        audioEl.play();
      }
    }
    changeAudioElement('audio1', false); // starts with audio1
  });

</script>

<h1>
  AudioContext (<a href="https://developer.mozilla.org/en-US/docs/Web/API/AudioContext" target="blank">api</a>)
</h1>
<button onClick="lowLag.play();">Play</button>
<button onClick="lowLag.pause();">Pause</button>
<button onClick="lowLag.stop();">Stop</button>
<button onClick="lowLag.switch(true);">Swtich</button>
<button onClick="lowLag.change('audio1', true);">Play 1</button>
<button onClick="lowLag.change('audio2', true);">Play 2</button>

<hr>

<h1>
  Audio Element (<a href="https://developer.mozilla.org/en-US/docs/Web/HTML/Element/audio" target="blank">api</a>)
</h1>
<audio id="audioElement" controls loop preload="auto" src="">
</audio>
<br>
<button onClick="playAudioElement();">Play</button>
<button onClick="pauseAudioElement();">Pause</button>
<button onClick="stopAudioElement();">Stop</button>
<button onClick="switchAudioElement(true);">Switch</button>
<button onClick="changeAudioElement('audio1', true);">Play 1</button>
<button onClick="changeAudioElement('audio2', true);">Play 2</button>


Answer 7 (score: 0)

Unfortunately, the only way to make it behave properly in Safari is to use the Web Audio API, or a third-party library that handles this. Check the (unminified) source code here:
https://drums-set-js.herokuapp.com/index.html
https://drums-set-js.herokuapp.com/app.js

Answer 8 (score: 0)

Same issue. I tried to preload the file in different ways. Finally I wrapped my animation logic in a "play" callback, so that logic should only run once the file has loaded and playback has started - yet I could see the animation already running while the audio was delayed by roughly 2 seconds. It blows my mind: how can the audio be delayed if its "play" callback has already fired?

AudioContext solved my issue. The simplest example I found is here: https://developer.mozilla.org/en-US/docs/Web/API/Body/arrayBuffer - getData prepares the audio file, and then you can play it with source.start(0);

That link leaves out how to get audioCtx; you can copy it from here: let audioCtx = new (window.AudioContext || window.webkitAudioContext)();
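
Put together, that approach looks roughly like this - a sketch that assumes your own file URL and a browser where decodeAudioData returns a Promise:

    // Hypothetical getData-style helper: fetch, decode, then play with source.start(0).
    let audioCtx = new (window.AudioContext || window.webkitAudioContext)();

    function playNote(url) {
        fetch(url) // e.g. 'blue.mp3' from the question
            .then(function(response) { return response.arrayBuffer(); })
            .then(function(data) { return audioCtx.decodeAudioData(data); })
            .then(function(buffer) {
                var source = audioCtx.createBufferSource();
                source.buffer = buffer;
                source.connect(audioCtx.destination);
                source.start(0); // starts immediately, without the <audio> lag
            });
    }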

Answer 9 (score: 0)

I simply create an <audio autoplay /> DOM element on click, and that works in all major browsers - there is no need to handle events and trigger playback manually.

If you want to react to audio state changes manually, I would suggest listening for the play event rather than loadeddata - its behaviour is more consistent across browsers.
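
A minimal sketch of that idea, reusing the class/id scheme from the question (the exact markup and file names are my own illustration, not the answerer's code):

    // Hypothetical handler: build a fresh autoplaying element on each click.
    $('body').on('click touchstart', '.play', function() {
        var note = $(this).attr('id').split('-')[1]; // "blue" or "green"
        var audio = new Audio(note + '.mp3');        // assumes the mp3 files from the question
        audio.autoplay = true;
        audio.addEventListener('play', function() {
            console.log(note + ' started playing');  // react to state changes here
        });
        document.body.appendChild(audio);
    });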