With AVAudioPlayer I used currentPosition to get and set the playback position, but how do I do the same with AVAudioEngine and AVAudioPlayerNode?
Answer 0 (score: 5)
AVAudioEngine is more complex than AVAudioPlayer, so to find the current position you have to compute sampleTime / sampleRate. The sampleTime comes from the AVAudioPlayerNode's lastRenderTime, and the sampleRate from the AVAudioFile's fileFormat.
Also, if you want to set currentPosition, you need to stream the data into the player with an AVAudioPCMBuffer, then set the AVAudioFile's framePosition property to move the file pointer forward. You have to combine this with AVAudioPlayerNode's playAtTime: method to set the player's current time.
Edit: the sample code link was removed because the repository was deleted from GitHub.
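The sample-time arithmetic this answer describes can be sketched as two small helpers. This is only an illustration of the conversions (the function names are hypothetical, not part of AVFoundation):

```swift
// Hypothetical helpers illustrating the conversions described above.
// `sampleRate` is the audio file's sample rate in Hz.

func secondsForFrames(_ frames: Int64, sampleRate: Double) -> Double {
    // elapsed seconds = sampleTime / sampleRate
    return Double(frames) / sampleRate
}

func framesForSeconds(_ seconds: Double, sampleRate: Double) -> Int64 {
    // starting frame for a seek = seconds * sampleRate, rounded down
    return Int64((seconds * sampleRate).rounded(.down))
}

// e.g. 441_000 frames at 44.1 kHz is 10.0 seconds,
// and a 12.5 s seek at 44.1 kHz starts at frame 551_250
```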
Answer 1 (score: 5)
Considering how useful this topic is, I'd like to share my answer.
The player to use with AVAudioEngine is AVAudioPlayerNode.
What you need:
@property (strong, nonatomic) AVAudioPlayerNode *player; // strong, or the node is deallocated immediately
@property (strong, nonatomic) AVAudioFile *file;
AVAudioFramePosition songLengthSamples;
float sampleRateSong;
float lengthSongSeconds;
float startInSongSeconds;
After you start playing the file:
songLengthSamples = self.file.length;
AVAudioFormat *songFormat = self.file.processingFormat;
sampleRateSong = songFormat.sampleRate;
lengthSongSeconds = songLengthSamples / sampleRateSong;
To get the playback position of the AVAudioPlayerNode while the file is playing:
if (self.player.isPlaying) {
    AVAudioTime *nodeTime = self.player.lastRenderTime;
    AVAudioTime *playerTime = [self.player playerTimeForNodeTime:nodeTime];
    float elapsedSeconds = startInSongSeconds + ((double)playerTime.sampleTime / sampleRateSong);
    NSLog(@"Elapsed seconds: %f", elapsedSeconds);
}
To set the playback position of the AVAudioPlayerNode while the file is playing:
[self.player stop];
startInSongSeconds = 12.5; // example seek target
unsigned long int startSample = (unsigned long int)floor(startInSongSeconds * sampleRateSong);
unsigned long int lengthSamples = songLengthSamples - startSample;
[self.player scheduleSegment:self.file startingFrame:startSample frameCount:(AVAudioFrameCount)lengthSamples atTime:nil completionHandler:^{
    // the scheduled segment has finished rendering; e.g. pause the player here
}];
[self.player play];
Answer 2 (score: 3)
In case anyone urgently needs this, I converted danyadd's answer to Swift 3 and made a simple player class.
import AVFoundation

class EasyPlayer {
    var engine: AVAudioEngine!
    var player: AVAudioPlayerNode!
    var audioFile: AVAudioFile!
    var songLengthSamples: AVAudioFramePosition!

    var sampleRateSong: Float = 0
    var lengthSongSeconds: Float = 0
    var startInSongSeconds: Float = 0

    let pitch: AVAudioUnitTimePitch

    init() {
        engine = AVAudioEngine()
        player = AVAudioPlayerNode()
        player.volume = 1.0

        let path = Bundle.main.path(forResource: "filename", ofType: "mp3")!
        let url = NSURL.fileURL(withPath: path)

        audioFile = try? AVAudioFile(forReading: url)
        songLengthSamples = audioFile.length

        let songFormat = audioFile.processingFormat
        sampleRateSong = Float(songFormat.sampleRate)
        lengthSongSeconds = Float(songLengthSamples) / sampleRateSong

        let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                      frameCapacity: AVAudioFrameCount(audioFile.length))
        do {
            try audioFile.read(into: buffer)
        } catch {
            print("Failed to read audio file into buffer: \(error)")
        }

        pitch = AVAudioUnitTimePitch()
        pitch.pitch = 1
        pitch.rate = 1

        engine.attach(player)
        engine.attach(pitch)
        engine.connect(player, to: pitch, format: buffer.format)
        engine.connect(pitch, to: engine.mainMixerNode, format: buffer.format)
        player.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)

        engine.prepare()
        do {
            try engine.start()
        } catch {
            print("Failed to start audio engine: \(error)")
        }
    }

    func setPitch(_ pitch: Float) {
        self.pitch.pitch = pitch
    }

    func play() {
        player.play()
    }

    func pause() {
        player.pause()
    }

    func getCurrentPosition() -> Float {
        if player.isPlaying {
            if let nodeTime = player.lastRenderTime,
               let playerTime = player.playerTime(forNodeTime: nodeTime) {
                let elapsedSeconds = startInSongSeconds + (Float(playerTime.sampleTime) / sampleRateSong)
                print("Elapsed seconds: \(elapsedSeconds)")
                return elapsedSeconds
            }
        }
        return 0
    }

    func seekTo(time: Float) {
        player.stop()
        // Remember the seek offset, since sampleTime restarts at 0 after stop();
        // without this, getCurrentPosition() is wrong after a seek.
        startInSongSeconds = time
        let startSample = AVAudioFramePosition(floor(time * sampleRateSong))
        let lengthSamples = AVAudioFrameCount(songLengthSamples - startSample)
        player.scheduleSegment(audioFile, startingFrame: startSample,
                               frameCount: lengthSamples, at: nil,
                               completionHandler: { self.player.pause() })
        player.play()
    }
}
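For context, a minimal usage sketch of the class above. It assumes a file named "filename.mp3" is bundled with the app, as the initializer expects, so it will only run inside an app target:

```swift
// Hypothetical usage of EasyPlayer; requires "filename.mp3" in the app bundle.
let easyPlayer = EasyPlayer()
easyPlayer.play()

// Later, jump to 12.5 seconds into the song...
easyPlayer.seekTo(time: 12.5)

// ...and read back the elapsed time while playing.
let position = easyPlayer.getCurrentPosition()
```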