iOS: merging three videos - rotating the center video

Date: 2015-09-21 15:36:57

Tags: ios objective-c video cgaffinetransform

I have three videos. The first is from the rear camera. The second is from the front camera, and the third is from the rear camera again. The videos are always shot in landscape mode with the home button on the right.

The rear-camera videos have the correct orientation. The center video, shot with the front camera, is rotated 180 degrees (upside down). I have been researching and trying multiple ways to transform the center video, with no luck. I get the same result every time.

I am beyond frustrated with this whole process. Everything I have read online, and the comments/suggestions from commenters, should work but does not. No matter what transform I try, the video comes out the same. It keeps behaving as if I had applied no transform at all. None. I cannot understand why the transform is being ignored. I have spent weeks on this and I am at the end of my rope - it simply does not work.

Here is the current iteration of my code:

- (void)mergeVideos2:(NSMutableArray *)assets withCompletion:(void (^)(NSString *))completion {

    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
    __block NSMutableArray *instructions = [[NSMutableArray alloc] init];
    __block CGSize size = CGSizeZero;
    __block CMTime time = kCMTimeZero;

    __block AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];

    __block CGAffineTransform transformflip = CGAffineTransformMakeScale(1, -1);
    //    __block CGAffineTransform transformflip = CGAffineTransformMakeRotation(M_PI);

    __block int32_t commontimescale = 600;

    [assets enumerateObjectsUsingBlock:^(id  _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {

        NSURL *assetUrl = (NSURL *)obj;
        AVAsset *asset = [AVAsset assetWithURL:assetUrl];

        CMTime cliptime = CMTimeConvertScale(asset.duration, commontimescale, kCMTimeRoundingMethod_QuickTime);

        NSLog(@"%s: Number of tracks: %lu", __PRETTY_FUNCTION__, (unsigned long)[[asset tracks] count]);
        AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;

        NSError *error;
        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, cliptime)
                                       ofTrack:assetTrack
                                        atTime:time
                                         error:&error];
        if (error) {
            NSLog(@"%s: Error - %@", __PRETTY_FUNCTION__, error.debugDescription);
        }

        AVMutableVideoCompositionLayerInstruction *videoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];

        CGAffineTransform transform = assetTrack.preferredTransform;
        [videoLayerInstruction setTransform:CGAffineTransformConcat(transform, transformflip) atTime:time];

        // the main instruction set - this is wrapping the time
        AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        videoCompositionInstruction.timeRange = CMTimeRangeMake(time, assetTrack.timeRange.duration);
        if (videoLayerInstruction != nil)
            videoCompositionInstruction.layerInstructions = @[videoLayerInstruction];
        [instructions addObject:videoCompositionInstruction];

        // time increment variables
        time = CMTimeAdd(time, cliptime);

        if (CGSizeEqualToSize(size, CGSizeZero)) {
            size = assetTrack.naturalSize;
        }

    }];

    mutableVideoComposition.instructions = instructions;

    // set the frame rate to 12 fps
    mutableVideoComposition.frameDuration = CMTimeMake(1, 12);

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths firstObject];
    int number = arc4random_uniform(10000);
    self.outputFile = [documentsDirectory stringByAppendingFormat:@"/export_%i.mov",number];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition
                                                                      presetName:AVAssetExportPreset1280x720];

    exporter.outputURL = [NSURL fileURLWithPath:self.outputFile];
    //Set the output file type
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;

    dispatch_group_t group = dispatch_group_create();

    dispatch_group_enter(group);

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_group_leave(group);

    }];

    dispatch_group_notify(group, dispatch_get_main_queue(), ^{

        // get the size of the file
        unsigned  long long size= ([[[NSFileManager defaultManager] attributesOfItemAtPath:self.outputFile error:nil] fileSize]);
        NSString *filesize = [NSByteCountFormatter stringFromByteCount:size countStyle:NSByteCountFormatterCountStyleFile];
        NSString *thereturn = [NSString stringWithFormat:@"%@: %@", self.outputFile, filesize];

        NSLog(@"Export File (Final) - %@", self.outputFile);
        completion(thereturn);

    });

}

Any ideas or suggestions?

1 Answer:

Answer 0 (score: 3)

Every AVAssetTrack has a preferredTransform property. It contains the information on how to rotate and translate the video so that it displays correctly, so you don't have to guess. Use each video's preferredTransform in its layer instruction.

Don't set "videoCompositionTrack.preferredTransform = ..."

Remove the transform ramp "[videoLayerInstruction setTransformRampFromStartTransform:..."

Inside that enumeration, just use:

CGAffineTransform transform = assetTrack.preferredTransform;
[videoLayerInstruction setTransform:transform atTime:time];

I am assuming your videos were shot at the same dimensions as the output, with the middle video having width and height reversed. If not, you will have to add the appropriate scaling:

float scaleFactor = ...; // i.e. (outputWidth / videoWidth)
CGAffineTransform scale = CGAffineTransformMakeScale(scaleFactor, scaleFactor);
transform = CGAffineTransformConcat(transform, scale);
[videoLayerInstruction setTransform:transform atTime:time];

Edit: The source videos that appear inverted in the composition are actually upside down to begin with, yet they carry an identity CGAffineTransform. This code displays them in the correct orientation:

- (void)mergeVideos2:(NSMutableArray *)assets withCompletion:(void (^)(NSString *))completion {

        AVMutableComposition *mutableComposition = [AVMutableComposition composition];
        AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                           preferredTrackID:kCMPersistentTrackID_Invalid];
        __block NSMutableArray *instructions = [[NSMutableArray alloc] init];
        __block CMTime time = kCMTimeZero;
        __block AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
        __block int32_t commontimescale = 600;

        // Create one layer instruction.  We have one video track, and there should be one layer instruction per video track.
        AVMutableVideoCompositionLayerInstruction *videoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];

        [assets enumerateObjectsUsingBlock:^(id  _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {

            NSURL *assetUrl = (NSURL *)obj;
            AVAsset *asset = [AVAsset assetWithURL:assetUrl];

            CMTime cliptime = CMTimeConvertScale(asset.duration, commontimescale, kCMTimeRoundingMethod_QuickTime);

            NSLog(@"%s: Number of tracks: %lu", __PRETTY_FUNCTION__, (unsigned long)[[asset tracks] count]);
            AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
            CGSize naturalSize = assetTrack.naturalSize;

            NSError *error;
            //insert the video from the assetTrack into the composition track
            [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, cliptime)
                                           ofTrack:assetTrack
                                            atTime:time
                                             error:&error];
            if (error) {
                NSLog(@"%s: Error - %@", __PRETTY_FUNCTION__, error.debugDescription);
            }


            CGAffineTransform transform = assetTrack.preferredTransform;

            //set the layer to have this videos transform at the time that this video starts
            if (<* the video is an intermediate video - has the wrong orientation *>) {
                //these videos have the identity transform, yet they are upside down.
                //we need to rotate them by M_PI radians (180 degrees) and shift the video back into place

                CGAffineTransform rotateTransform = CGAffineTransformMakeRotation(M_PI);
                CGAffineTransform translateTransform = CGAffineTransformMakeTranslation(naturalSize.width, naturalSize.height);
                [videoLayerInstruction setTransform:CGAffineTransformConcat(rotateTransform, translateTransform) atTime:time];

            } else {
                [videoLayerInstruction setTransform:transform atTime:time];
            }

            // time increment variables
            time = CMTimeAdd(time, cliptime);

        }];

        // the main instruction set - this is wrapping the time
        AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero,mutableComposition.duration); //make the instruction last for the entire composition
        videoCompositionInstruction.layerInstructions = @[videoLayerInstruction];
        [instructions addObject:videoCompositionInstruction];
        mutableVideoComposition.instructions = instructions;

        // set the frame rate to 12 fps
        mutableVideoComposition.frameDuration = CMTimeMake(1, 12);

        //set the rendersize for the video we're about to write
        mutableVideoComposition.renderSize = CGSizeMake(1280,720);

        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths firstObject];
        int number = arc4random_uniform(10000);
        self.outputFile = [documentsDirectory stringByAppendingFormat:@"/export_%i.mov",number];

        //let the rendersize of the video composition dictate size.  use quality preset here
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition
                                                                          presetName:AVAssetExportPresetHighestQuality];

        exporter.outputURL = [NSURL fileURLWithPath:self.outputFile];
        //Set the output file type
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        exporter.shouldOptimizeForNetworkUse = YES;
        exporter.videoComposition = mutableVideoComposition;

        dispatch_group_t group = dispatch_group_create();

        dispatch_group_enter(group);

        [exporter exportAsynchronouslyWithCompletionHandler:^{
            dispatch_group_leave(group);

        }];

        dispatch_group_notify(group, dispatch_get_main_queue(), ^{

            // get the size of the file
            unsigned  long long size= ([[[NSFileManager defaultManager] attributesOfItemAtPath:self.outputFile error:nil] fileSize]);
            NSString *filesize = [NSByteCountFormatter stringFromByteCount:size countStyle:NSByteCountFormatterCountStyleFile];
            NSString *thereturn = [NSString stringWithFormat:@"%@: %@", self.outputFile, filesize];

            NSLog(@"Export File (Final) - %@", self.outputFile);
            completion(thereturn);

        });
    }