Seamlessly concatenating AVAssets

Date: 2019-02-23 09:29:13

Tags: swift macos avfoundation

I have some simple AVFoundation code that concatenates a bunch of four-second-long mp4 files, shown below:

func
compose(parts inParts: [Part], progress inProgress: (CMTime) -> ())
    -> AVAsset?
{
    guard
        let composition = self.composition,
        let videoTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid),
        let audioTrack = composition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
    else
    {
        debugLog("Unable to create tracks for composition")
        return nil
    }

    do
    {
        //  Append each part’s video and audio to the composition tracks,
        //  back to back…

        var time = CMTime.zero
        for p in inParts
        {
            let asset = AVURLAsset(url: p.path.url)
            if let track = asset.tracks(withMediaType: .video).first
            {
                try videoTrack.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration), of: track, at: time)
            }
            if let track = asset.tracks(withMediaType: .audio).first
            {
                try audioTrack.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration), of: track, at: time)
            }

            time = CMTimeAdd(time, asset.duration)
            inProgress(time)
        }
    }
    catch let e
    {
        debugLog("Error adding clips: \(e)")
        return nil
    }

    return composition
}
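
Note: self.composition is not shown in the question; presumably it is an AVMutableComposition stored on the enclosing type, along these lines (an assumption, not part of the original code):

//  Assumed declaration for the `composition` property the guard above
//  unwraps; not part of the original question.
private var composition: AVMutableComposition? = AVMutableComposition()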

Unfortunately, you can hear a small audio pop every four seconds, so to me this is not a completely seamless concatenation. What can I do to improve it?

Solution

Thanks to NoHalfBits’ excellent answer below, I updated the loop above with the following, and it works great:

        for p in inParts
        {
            let asset = AVURLAsset(url: p.path.url)

            //  It’s possible (and turns out, it’s often the case with UniFi NVR recordings)
            //  for the audio and video tracks to be of slightly different start time
            //  and duration. Find the intersection of the two tracks’ time ranges and
            //  use that range when inserting both tracks into the composition…

            //  Calculate the common time range between the video and audio tracks…

            let sourceVideo = asset.tracks(withMediaType: .video).first
            let sourceAudio = asset.tracks(withMediaType: .audio).first
            var commonTimeRange = CMTimeRange.zero
            if sourceVideo != nil && sourceAudio != nil
            {
                commonTimeRange = CMTimeRangeGetIntersection(sourceVideo!.timeRange, otherRange: sourceAudio!.timeRange)
            }
            else if sourceVideo != nil
            {
                commonTimeRange = sourceVideo!.timeRange
            }
            else if sourceAudio != nil
            {
                commonTimeRange = sourceAudio!.timeRange
            }
            else
            {
                //  There are neither video nor audio tracks; bail…

                continue
            }

            debugLog("Asset duration: \(asset.duration.seconds), common time range duration: \(commonTimeRange.duration.seconds)")

            //  Insert the video and audio tracks…

            if sourceVideo != nil
            {
                try videoTrack.insertTimeRange(commonTimeRange, of: sourceVideo!, at: time)
            }
            if sourceAudio != nil
            {
                try audioTrack.insertTimeRange(commonTimeRange, of: sourceAudio!, at: time)
            }

            time = time + commonTimeRange.duration
            inProgress(time)
        }

1 Answer:

Answer 0 (score: 1)

In an mp4 container, every track can have its own start time and duration. Especially with recorded material, it is not uncommon for the audio and video tracks to have slightly different time ranges (inserting a few CMTimeRangeShow(track.timeRange) calls near the insertTimeRange calls will make this visible).
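
For illustration, a minimal diagnostic along those lines; the file URL is a placeholder, and the print line just restates the same range in seconds:

import AVFoundation

//  Log each track’s own time range to make the video/audio mismatch visible.
let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/clip.mp4"))
for track in asset.tracks
{
    CMTimeRangeShow(track.timeRange)    //  logs the CMTimeRange to the console
    print("\(track.mediaType.rawValue): start \(track.timeRange.start.seconds)s, duration \(track.timeRange.duration.seconds)s")
}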

To fix this, don’t blindly insert time ranges starting at CMTime.zero with the whole asset’s duration (which is the maximum end time of all its tracks). Instead:

  • get the timeRange of the source audio and video tracks
  • compute the common time range from these (CMTimeRangeGetIntersection does this for you; see the sketch after this list)
  • use the common time range when inserting the segments from the source tracks into the destination tracks
  • advance time by the duration of the common time range
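
A minimal sketch of that range selection, assuming videoRange and audioRange hold the optional timeRange values of the source tracks (CMTimeRange.intersection(_:) is the Swift form of CMTimeRangeGetIntersection):

import CoreMedia

//  Returns the range common to both tracks, the single track’s range when
//  only one is present, or nil when the asset has no usable tracks.
func commonRange(video videoRange: CMTimeRange?, audio audioRange: CMTimeRange?) -> CMTimeRange?
{
    switch (videoRange, audioRange)
    {
    case (let v?, let a?): return v.intersection(a)
    case (let v?, nil):    return v
    case (nil, let a?):    return a
    case (nil, nil):       return nil
    }
}

Because both insertTimeRange calls then use the same start and duration, neither track runs past the other at each joint, which is what removes the pop.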