Using AVMutableComposition

Date: 2015-05-21 10:54:50

Tags: ios objective-c video avfoundation avmutablecomposition

This question has been asked several times before, but nothing has helped. I am merging multiple videos using AVMutableComposition. After merging, roughly 30% to 40% of the resulting videos contain blank frames; the other merges come out fine. I play the composition directly with AVPlayer via an AVPlayerItem. The code is below:

    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];

    NSMutableArray *instructions = [NSMutableArray new];
    CGSize size = CGSizeZero;

    CMTime time = kCMTimeZero;
    for (AVURLAsset *asset in assets)
    {
        AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVAssetTrack *audioAssetTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;

        NSError *error = nil;
        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                                       ofTrack:assetTrack
                                        atTime:time
                                         error:&error];
        if (error) {
            NSLog(@"asset url :: %@", assetTrack.asset);
            NSLog(@"Error - %@", error.debugDescription);
        }

        error = nil;
        [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                                       ofTrack:audioAssetTrack
                                        atTime:time
                                         error:&error];
        if (error) {
            NSLog(@"Error - %@", error.debugDescription);
        }

        AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        videoCompositionInstruction.timeRange = CMTimeRangeMake(time, assetTrack.timeRange.duration);
        videoCompositionInstruction.layerInstructions = @[[AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack]];
        [instructions addObject:videoCompositionInstruction];

        time = CMTimeAdd(time, assetTrack.timeRange.duration);

        if (CGSizeEqualToSize(size, CGSizeZero)) {
            size = assetTrack.naturalSize;
        }
    }

    AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
    mutableVideoComposition.instructions = instructions;
    mutableVideoComposition.frameDuration = CMTimeMake(1, 30);
    mutableVideoComposition.renderSize = size;

    playerItem = [AVPlayerItem playerItemWithAsset:mutableComposition];
    playerItem.videoComposition = mutableVideoComposition;

1 Answer:

Answer 0 (score: 1)

As far as I know, an AVMutableVideoCompositionLayerInstruction cannot simply be "appended" or "added" the way your code does.

From your code, I guess you want to keep the video instruction information when merging video assets, but instructions cannot be copied "directly".

If you want to do that, see the documentation for AVVideoCompositionLayerInstruction, e.g.:

    getTransformRampForTime:startTransform:endTransform:timeRange:
    setTransformRampFromStartTransform:toEndTransform:timeRange:
    setTransform:atTime:

    getOpacityRampForTime:startOpacity:endOpacity:timeRange:
    setOpacityRampFromStartOpacity:toEndOpacity:timeRange:
    setOpacity:atTime:

    getCropRectangleRampForTime:startCropRectangle:endCropRectangle:timeRange:
    setCropRectangleRampFromStartCropRectangle:toEndCropRectangle:timeRange:
    setCropRectangle:atTime:

You should use the getFoo... methods on the source track's layer instruction, then compute the insertTime and timeRange for the final track, call setFoo..., and finally append the result to the layerInstructions of the final videoComposition.
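A hedged sketch of that get/shift/set flow, assuming a `sourceInstruction` read from the source asset's videoComposition, its `sourceRange` on the source timeline, an `insertTime` on the merged timeline, and a `destinationLayerInstruction` built for the final track (all four names are hypothetical):

    // Read a transform ramp from the source layer instruction...
    CGAffineTransform startTransform, endTransform;
    CMTimeRange rampRange;
    if ([sourceInstruction getTransformRampForTime:sourceRange.start
                                    startTransform:&startTransform
                                      endTransform:&endTransform
                                         timeRange:&rampRange]) {
        // ...shift its time range onto the merged timeline...
        CMTime offset = CMTimeSubtract(rampRange.start, sourceRange.start);
        CMTimeRange shifted = CMTimeRangeMake(CMTimeAdd(insertTime, offset),
                                              rampRange.duration);
        // ...and re-apply it on the destination instruction.
        [destinationLayerInstruction setTransformRampFromStartTransform:startTransform
                                                         toEndTransform:endTransform
                                                              timeRange:shifted];
    }

The same pattern applies to the opacity and crop-rectangle getters and setters.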

Yes, it's a bit complicated... And, most importantly, you cannot recover every video effect that was applied to the source assets.

So what is your goal, and what kinds of source assets do you need to support?

If you just want to merge some mp4/mov files, simply loop over the tracks and append them to an AVMutableCompositionTrack, with no videoComposition at all. I tested your code and it works.
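For that simple case, a minimal sketch (reusing the `assets` array from the question) could look like this:

    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime cursor = kCMTimeZero;
    for (AVURLAsset *asset in assets) {
        AVAssetTrack *v = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
        AVAssetTrack *a = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;
        CMTimeRange range = CMTimeRangeMake(kCMTimeZero, asset.duration);
        NSError *error = nil;
        if (v) [videoTrack insertTimeRange:range ofTrack:v atTime:cursor error:&error];
        if (a) [audioTrack insertTimeRange:range ofTrack:a atTime:cursor error:&error];
        cursor = CMTimeAdd(cursor, asset.duration);
    }
    // The composition is itself an AVAsset, so it can be played directly:
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];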

If you want to merge AVAssets that carry video instructions, see the notes and docs above. My best practice: before merging, save those AVAssets to files with AVAssetExportSession, then merge the exported video files.
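A sketch of that flatten step, assuming `sourceAsset`, `sourceVideoComposition`, and `outputURL` are yours (all three names hypothetical):

    AVAssetExportSession *export =
        [[AVAssetExportSession alloc] initWithAsset:sourceAsset
                                         presetName:AVAssetExportPresetHighestQuality];
    export.outputURL = outputURL;
    export.outputFileType = AVFileTypeMPEG4;
    export.videoComposition = sourceVideoComposition; // bakes the instructions into the file
    [export exportAsynchronouslyWithCompletionHandler:^{
        if (export.status == AVAssetExportSessionStatusCompleted) {
            // outputURL now holds a plain file with the effects rendered,
            // which can be merged with a simple track append.
        }
    }];

Each flattened file then needs no videoComposition of its own when merged.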

P.S. There may also be some problem with your test files or source assets.

Here is code from my Vine-like project:

    - (BOOL)generateComposition
    {
            [self cleanComposition];

            NSUInteger segmentsCount = self.segmentsCount;
            if (0 == segmentsCount) {
                    return NO;
            }

            AVMutableComposition *composition = [AVMutableComposition composition];
            AVMutableVideoComposition *videoComposition = nil;
            AVMutableVideoCompositionInstruction *videoCompositionInstruction = nil;
            AVMutableVideoCompositionLayerInstruction *videoCompositionLayerInstruction = nil;
            AVMutableAudioMix *audioMix = nil;

            AVMutableCompositionTrack *videoTrack = nil;
            AVMutableCompositionTrack *audioTrack = nil;
            AVMutableCompositionTrack *musicTrack = nil;
            CMTime currentTime = kCMTimeZero;

            for (MVRecorderSegment *segment in self.segments) {
                    AVURLAsset *asset = segment.asset;
                    NSArray *videoAssetTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
                    NSArray *audioAssetTracks = [asset tracksWithMediaType:AVMediaTypeAudio];

                    CMTime maxBounds = kCMTimeInvalid;

                    CMTime videoTime = currentTime;
                    for (AVAssetTrack *videoAssetTrack in videoAssetTracks) {
                            if (!videoTrack) {
                                    videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
                                    videoTrack.preferredTransform = CGAffineTransformIdentity;

                                    videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
                                    videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
                            }

                            /* Fix orientation */
                            CGAffineTransform transform = videoAssetTrack.preferredTransform;
                            if (AVCaptureDevicePositionFront == segment.cameraPosition) {
                                    transform = CGAffineTransformMakeTranslation(self.config.videoSize, 0);
                                    transform = CGAffineTransformScale(transform, -1.0, 1.0);
                            } else if (AVCaptureDevicePositionBack == segment.cameraPosition) {

                            }
                            [videoCompositionLayerInstruction setTransform:transform atTime:videoTime];

                            /* Append track */
                            videoTime = [MVHelper appendAssetTrack:videoAssetTrack toCompositionTrack:videoTrack atTime:videoTime withBounds:maxBounds];
                            maxBounds = videoTime;
                    }

                    if (self.sessionConfiguration.originalVoiceOn) {
                            CMTime audioTime = currentTime;
                            for (AVAssetTrack *audioAssetTrack in audioAssetTracks) {
                                    if (!audioTrack) {
                                            audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
                                    }
                                    audioTime = [MVHelper appendAssetTrack:audioAssetTrack toCompositionTrack:audioTrack atTime:audioTime withBounds:maxBounds];
                            }
                    }

                    currentTime = composition.duration;
            }

            if (videoCompositionInstruction && videoCompositionLayerInstruction) {
                    videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
                    videoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction];

                    videoComposition = [AVMutableVideoComposition videoComposition];
                    videoComposition.renderSize = CGSizeMake(self.config.videoSize, self.config.videoSize);
                    videoComposition.frameDuration = CMTimeMake(1, self.config.videoFrameRate);
                    videoComposition.instructions = @[videoCompositionInstruction];
            }


            // Add the background music track (musicTrack)
            NSURL *musicFileURL = self.sessionConfiguration.musicFileURL;
            if (musicFileURL && musicFileURL.isFileExists) {
                    AVAsset *musicAsset = [AVAsset assetWithURL:musicFileURL];
                    AVAssetTrack *musicAssetTrack = [musicAsset tracksWithMediaType:AVMediaTypeAudio].firstObject;
                    if (musicAssetTrack) {
                            musicTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
                            if (CMTIME_COMPARE_INLINE(musicAsset.duration, >=, composition.duration)) {
                                    // If the music is longer than the whole composition, insert it directly
                                    [musicTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, composition.duration) ofTrack:musicAssetTrack atTime:kCMTimeZero error:NULL];
                            } else {
                                    // Otherwise, loop the background music
                                    CMTime musicTime = kCMTimeZero;
                                    CMTime bounds = composition.duration;
                                    while (true) {
                                            musicTime = [MVHelper appendAssetTrack:musicAssetTrack toCompositionTrack:musicTrack atTime:musicTime withBounds:bounds];
                                            if (CMTIME_COMPARE_INLINE(musicTime, >=, composition.duration)) {
                                                    break;
                                            }
                                    }
                            }
                    }
            }

            // Process the audio
            if (musicTrack) {
                    AVMutableAudioMixInputParameters *audioMixParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:musicTrack];

                    /* Fade the background music in and out */
                    AVAsset *musicAsset = musicTrack.asset;
                    CMTime crossfadeDuration = CMTimeMake(15, 10); // 1.5 seconds at each end
                    CMTime halfDuration = CMTimeMultiplyByFloat64(musicAsset.duration, 0.5);
                    crossfadeDuration = CMTimeMinimum(crossfadeDuration, halfDuration);
                    CMTimeRange crossfadeRangeBegin = CMTimeRangeMake(kCMTimeZero, crossfadeDuration);
                    CMTimeRange crossfadeRangeEnd = CMTimeRangeMake(CMTimeSubtract(musicAsset.duration, crossfadeDuration), crossfadeDuration);
                    [audioMixParameters setVolumeRampFromStartVolume:0.0 toEndVolume:self.sessionConfiguration.musicVolume timeRange:crossfadeRangeBegin];
                    [audioMixParameters setVolumeRampFromStartVolume:self.sessionConfiguration.musicVolume toEndVolume:0.0 timeRange:crossfadeRangeEnd];

                    audioMix = [AVMutableAudioMix audioMix];
                    [audioMix setInputParameters:@[audioMixParameters]];
            }

            _composition = composition;
            _videoComposition = videoComposition;
            _audioMix = audioMix;

            return YES;
    }


    - (AVPlayerItem *)playerItem
    {
            AVPlayerItem *playerItem = nil;
            if (self.composition) {
                    playerItem = [AVPlayerItem playerItemWithAsset:self.composition];
                    if (!self.videoComposition.animationTool) {
                            playerItem.videoComposition = self.videoComposition;
                    }
                    playerItem.audioMix = self.audioMix;
            }
            return playerItem;
    }

    ///=============================================
    /// MVHelper
    ///=============================================

    + (CMTime)appendAssetTrack:(AVAssetTrack *)track toCompositionTrack:(AVMutableCompositionTrack *)compositionTrack atTime:(CMTime)atTime withBounds:(CMTime)bounds
    {
            CMTimeRange timeRange = track.timeRange;
            atTime = CMTimeAdd(atTime, timeRange.start);

            if (!track || !compositionTrack) {
                    return atTime;
            }

            if (CMTIME_IS_VALID(bounds)) {
                    CMTime currentBounds = CMTimeAdd(atTime, timeRange.duration);
                    if (CMTIME_COMPARE_INLINE(currentBounds, >, bounds)) {
                            timeRange = CMTimeRangeMake(timeRange.start, CMTimeSubtract(timeRange.duration, CMTimeSubtract(currentBounds, bounds)));
                    }
            }
            if (CMTIME_COMPARE_INLINE(timeRange.duration, >, kCMTimeZero)) {
                    NSError *error = nil;
                    [compositionTrack insertTimeRange:timeRange ofTrack:track atTime:atTime error:&error];
                    if (error) {
                            MVLog(@"Failed to append %@ track: %@", compositionTrack.mediaType, error);
                    }
                    return CMTimeAdd(atTime, timeRange.duration);
            }

            return atTime;
    }