Using AVMutableVideoComposition

Date: 2015-07-14 09:23:19

Tags: ios objective-c video avfoundation avmutablecomposition

I am trying to merge multiple videos using AVMutableComposition. The problem I am facing is that whenever I add an AVMutableVideoComposition to apply any instructions, playback freezes in AVPlayer at exactly 6 seconds.

Another interesting thing is that if I export with the same videoComposition and then play the result in the Photos app on my iPad, it plays fine. So why does it freeze at 6 seconds in AVPlayer?

Code:

CMTime time = kCMTimeZero;  // running insertion point for the next asset
CGSize size = CGSizeZero;   // render size, taken from the first video track

AVMutableComposition *mutableComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];

AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];

for (AVURLAsset *asset in assets)
{
    AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *audioAssetTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;

    NSError *error = nil;
    [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                                   ofTrack:assetTrack
                                    atTime:time
                                     error:&error];
    if (error) {
        NSLog(@"asset url :: %@", assetTrack.asset);
        NSLog(@"Error1 - %@", error.debugDescription);
    }

    error = nil;
    [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAssetTrack.timeRange.duration)
                                   ofTrack:audioAssetTrack
                                    atTime:time
                                     error:&error];
    if (error) {
        NSLog(@"Error2 - %@", error.debugDescription);
    }

    time = CMTimeAdd(time, assetTrack.timeRange.duration);

    if (CGSizeEqualToSize(size, CGSizeZero)) {
        size = assetTrack.naturalSize;
    }
}

AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
AVMutableVideoCompositionLayerInstruction *videoTrackLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];

mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, time);
mainInstruction.layerInstructions = [NSArray arrayWithObjects:videoTrackLayerInstruction, nil];

AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
mainCompositionInst.renderSize = size;

pi = [AVPlayerItem playerItemWithAsset:mutableComposition];
pi.videoComposition = mainCompositionInst;

Also, I know the problem is specifically the videoComposition, because if I remove the videoComposition, it plays fine in AVPlayer.

Update 1: I just found that when it freezes after 6 seconds, if I drag the slider backward or forward (i.e., use seekToTime), it starts playing normally again and does not freeze any further.

Even while the video is frozen, the audio keeps playing.

Update 2: If I go ahead and export it with AVAssetExportSession using the same AVMutableComposition, then load an asset from the exported video, it works fine. So the problem only appears when I play the AVMutableComposition directly.
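For reference, the export path described above can be sketched roughly as follows. This is a minimal sketch, not the asker's actual code; `mutableComposition` and `mainCompositionInst` are the objects built earlier, and `outputURL` is assumed to be a writable file URL supplied by the caller:

    // Export the same composition + video composition to a file,
    // then load an asset back from the exported movie.
    AVAssetExportSession *exporter =
        [[AVAssetExportSession alloc] initWithAsset:mutableComposition
                                         presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.videoComposition = mainCompositionInst; // same video composition used for playback

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        if (exporter.status == AVAssetExportSessionStatusCompleted) {
            // Playing an asset loaded from the exported file works without freezing.
            AVURLAsset *exported = [AVURLAsset assetWithURL:outputURL];
            // ... hand `exported` to an AVPlayerItem as usual
        } else {
            NSLog(@"Export failed: %@", exporter.error);
        }
    }];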

1 Answer:

Answer 0 (score: 1):

Finally, I found a solution that fixes this issue.

You should start playback only after the playerItem's status has changed to .ReadyToPlay.

See the following:

func startVideoPlayer() {
    let playerItem = AVPlayerItem(asset: self.composition!)
    playerItem.videoComposition = self.videoComposition!

    let player = AVPlayer(playerItem: playerItem)
    player.actionAtItemEnd = .None

    videoPlayerLayer = AVPlayerLayer(player: player)
    videoPlayerLayer!.frame = self.bounds

    /* observe the current item's status via the player */
    player.addObserver(self, forKeyPath: "currentItem.status", options: .New, context: nil)

    NSNotificationCenter.defaultCenter().addObserver(self, selector: "playerItemDidReachEnd:", name: AVPlayerItemDidPlayToEndTimeNotification, object: playerItem)

    self.layer.addSublayer(videoPlayerLayer!)
}

override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath != nil && keyPath! == "currentItem.status" {
        if let newValue = change?[NSKeyValueChangeNewKey] {
            if AVPlayerStatus(rawValue: newValue as! Int) == .ReadyToPlay {
                playVideo() /* play only after the status has changed to .ReadyToPlay */
            }
        }
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}

func playerItemDidReachEnd(notification: NSNotification) {
    let playerItem = notification.object as! AVPlayerItem
    playerItem.seekToTime(kCMTimeZero)

    playVideo()
} 

func playVideo() {
    videoPlayerLayer?.player!.play()
}