I'm planning to refactor the recording system in my iOS app. Context: so far I have recorded video and audio separately, starting both recordings at roughly the same time. Once recording is finished, the same system plays back the video and the audio separately, applying AudioUnits to the audio on the fly. Finally, I merge the video and the modified audio. The problem: the two recordings don't start at exactly the same moment (for whatever reason), which produces an out-of-sync result.
Would it be possible to refactor my system like this:
1) Record normal video with audio into a .mov file --> that way I'd be sure audio and video are synchronized.
2) While viewing the result with AVPlayer, process the audio part on the fly (I will use AudioKit) --> that's the part I'm not confident about.
Would I be able to send the audio buffer to AudioKit (which would process it) and give the processed audio back to AVPlayer as if it were the original AVPlayer audio track?
3) Save a final file with the video and the modified audio --> easy part with AVFoundation.
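For step 3, a minimal sketch of the merge using AVFoundation (the names `videoURL`, `processedAudioURL`, and `outputURL` are placeholders, not part of any existing code):

```swift
import AVFoundation

// Merge the video track of one file with the processed audio of another.
func merge(videoURL: URL, processedAudioURL: URL, outputURL: URL,
           completion: @escaping (Error?) -> Void) {
    let composition = AVMutableComposition()
    let videoAsset = AVURLAsset(url: videoURL)
    let audioAsset = AVURLAsset(url: processedAudioURL)

    guard
        let videoTrack = videoAsset.tracks(withMediaType: .video).first,
        let audioTrack = audioAsset.tracks(withMediaType: .audio).first,
        let compVideo = composition.addMutableTrack(withMediaType: .video,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid),
        let compAudio = composition.addMutableTrack(withMediaType: .audio,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid)
    else { return }

    do {
        // Insert both tracks over the full duration of the video.
        let range = CMTimeRange(start: .zero, duration: videoAsset.duration)
        try compVideo.insertTimeRange(range, of: videoTrack, at: .zero)
        try compAudio.insertTimeRange(range, of: audioTrack, at: .zero)
    } catch {
        completion(error)
        return
    }

    guard let export = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetHighestQuality)
    else { return }
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.exportAsynchronously { completion(export.error) }
}
```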
Feel free to ask for any information ;)

Answer (score: 6)
I can think of a fairly simple way to do this.
Basically, you just need to open your video file in an AKPlayer instance, then mute the video player's own audio. Now you have the video's audio inside AudioKit. Locking the video and audio together with a common clock is pretty simple. Pseudocode of the process:
// This will represent a common clock based on the host time
let audioClock = CMClockGetHostTimeClock()

// your video player
let videoPlayer = AVPlayer(url: videoURL)
videoPlayer.masterClock = audioClock
videoPlayer.automaticallyWaitsToMinimizeStalling = false

....

// your video-audio player (AKPlayer's init can fail)
var audioPlayer: AKPlayer?
if let player = try? AKPlayer(url: videoURL) {
    audioPlayer = player
}

// Schedule both players against the same host time.
func schedulePlayback(videoTime: TimeInterval, audioTime: TimeInterval, hostTime: UInt64) {
    audioPlay(at: audioTime, hostTime: hostTime)
    videoPlay(at: videoTime, hostTime: hostTime)
}

func audioPlay(at time: TimeInterval = 0, hostTime: UInt64 = 0) {
    audioPlayer?.play(when: time, hostTime: hostTime)
}

func videoPlay(at time: TimeInterval = 0, hostTime: UInt64 = 0) {
    let cmHostTime = CMClockMakeHostTimeFromSystemUnits(hostTime)
    let cmVTime = CMTimeMakeWithSeconds(time, 1000000)
    let futureTime = CMTimeAdd(cmHostTime, cmVTime)
    videoPlayer.setRate(1, time: kCMTimeInvalid, atHostTime: futureTime)
}
You can connect the player to any AudioKit processing chain in the normal way.
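As a sketch, assuming the AudioKit 4 API (`AKCostelloReverb` here stands in for whatever effect you actually want to apply):

```swift
import AudioKit

// `audioPlayer` is the AKPlayer from the pseudocode above.
// Route it through an effect node, then make that the engine output.
let reverb = AKCostelloReverb(audioPlayer)
reverb.feedback = 0.7

AudioKit.output = reverb
try AudioKit.start()
```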
If you want to export the audio, run an AKNodeRecorder on the final node of your processing chain. Record that to a file, then merge the audio into your video. I'm not sure whether AudioKit's offline processing is ready yet, so you may need to play the audio through in real time to capture the processed output.
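A hedged sketch of that capture step, again assuming AudioKit 4 (`finalNode` is a placeholder for whatever node ends your chain):

```swift
import AudioKit

// Tap the last node in the chain and record its output to a file.
let recorder = try AKNodeRecorder(node: finalNode)
try recorder.record()

// ... play the video and processed audio through in real time ...

recorder.stop()

// The recorded file, ready to merge back into the video with AVFoundation.
if let file = recorder.audioFile {
    print(file.url)
}
```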