I'm trying a different approach to merging videos: I'm creating a new track for each transition.
The problem with this code is that the first video is shown and all of the other videos are black.
The audio overlay is correct for the whole clip. It looks like the videos are not being brought into the composition, because the output file is about 5 MB when it should be roughly 25 MB. The 5 MB size corresponds to the first clip plus the audio track. All of the AVAssets appear to be valid, and the files do exist on the file system. Here is the code:
- (void)mergeVideos:(NSMutableArray *)assets withCompletion:(void (^)(NSString *))completion; {
    // NSMutableArray *instructions = [NSMutableArray new];
    CGSize size = CGSizeZero;
    CMTime currentstarttime = kCMTimeZero;
    int tracknumber = 1;
    int32_t commontimescale = 600;
    CMTime time = kCMTimeZero;
    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    NSMutableArray *instructions = [[NSMutableArray alloc] init];

    for (NSURL *assetUrl in assets) {
        AVAsset *asset = [AVAsset assetWithURL:assetUrl];
        NSLog(@"Number of tracks: %lu Incremental track number %i", (unsigned long)[[asset tracks] count], tracknumber);

        // make sure the timescales are correct for these tracks
        CMTime cliptime = CMTimeConvertScale(asset.duration, commontimescale, kCMTimeRoundingMethod_QuickTime);

        AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                            preferredTrackID:kCMPersistentTrackID_Invalid];
        AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;

        NSLog(@"Running time: value = %lld timescale = %d", time.value, time.timescale);
        NSLog(@"Asset length: value = %lld timescale = %d", asset.duration.value, asset.duration.timescale);
        NSLog(@"Converted Scale: value = %lld timescale = %d", cliptime.value, cliptime.timescale);

        NSError *error;
        [videoCompositionTrack insertEmptyTimeRange:CMTimeRangeMake(kCMTimeZero, time)];
        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(time, cliptime)
                                       ofTrack:assetTrack
                                        atTime:time
                                         error:&error];
        if (error) {
            NSLog(@"Error - %@", error.debugDescription);
        }

        // this flips the video temporarily for the front facing camera
        AVMutableVideoCompositionLayerInstruction *inst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];

        // set the flipping transform on the correct tracks
        if ((tracknumber == 2) || (tracknumber == 4) || (tracknumber == 6) || (tracknumber == 8) || (tracknumber == 10)) {
            CGAffineTransform transform = CGAffineTransformMakeRotation(M_PI);
            [inst setTransform:transform atTime:time];
        } else {
            CGAffineTransform transform = assetTrack.preferredTransform;
            [inst setTransform:transform atTime:time];
        }

        // don't block the other videos with your black - needs to be the incremental time
        [inst setOpacity:0.0 atTime:time];

        // add the instructions to the overall array
        [instructions addObject:inst];

        // increment the total time after we use it for this iteration
        time = CMTimeAdd(time, cliptime);

        if (CGSizeEqualToSize(size, CGSizeZero)) {
            size = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject.naturalSize;
        }

        // increment the track counter
        tracknumber++;
    }

    AVMutableVideoCompositionInstruction *mainVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, time);
    mainVideoCompositionInstruction.layerInstructions = instructions;

    // bring all of the video together in the main composition
    AVMutableVideoComposition *mainVideoComposition = [AVMutableVideoComposition videoComposition];
    mainVideoComposition.instructions = [NSArray arrayWithObject:mainVideoCompositionInstruction];

    // setup the audio
    AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];

    // Grab the path, make sure to add it to your project!
    NSURL *soundURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"bink-bink-lexus-3" ofType:@"aif"]];
    AVURLAsset *soundAsset = [AVURLAsset assetWithURL:soundURL];
    NSError *error;

    // add audio to the entire track
    [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, mutableComposition.duration)
                                   ofTrack:[soundAsset tracksWithMediaType:AVMediaTypeAudio][0]
                                    atTime:kCMTimeZero
                                     error:&error];

    // Set the frame duration to an appropriate value (i.e. 30 frames per second for video).
    // mainVideoComposition.frameDuration = CMTimeMake(1, 30);
    mainVideoComposition.renderSize = size;

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths firstObject];
    int number = arc4random_uniform(10000);
    self.outputFile = [documentsDirectory stringByAppendingFormat:@"/export_%i.mov", number];

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition
                                                                      presetName:AVAssetExportPreset1280x720];
    exporter.outputURL = [NSURL fileURLWithPath:self.outputFile];

    // Set the output file type
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;

    dispatch_group_t group = dispatch_group_create();
    dispatch_group_enter(group);
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_group_leave(group);
    }];
    dispatch_group_notify(group, dispatch_get_main_queue(), ^{
        NSLog(@"Export File (Final) - %@", self.outputFile);
        completion(self.outputFile);
    });
}
Answer 0 (score: 1)
Your problem is that by using multiple AVMutableCompositionTracks and inserting each time range at a point after kCMTimeZero, you are causing each subsequent track's media to appear in the composition at kCMTimeZero. If you want to pursue this route, you need to use insertEmptyTimeRange:. It moves that particular track's media forward by the duration of the empty range you insert.
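For illustration, here is a minimal sketch of that multi-track route, as shown below. It is only a sketch: it assumes assetTrack and cliptime exist as in the question's loop, and cursor is my own name for the running insertion point.

// Sketch: pad each new video track with an empty range so its media starts
// where the previous clip ended instead of at kCMTimeZero.
// Assumes assetTrack and cliptime are set up as in the question's loop.
AVMutableCompositionTrack *videoTrack =
    [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
if (CMTIME_COMPARE_INLINE(cursor, >, kCMTimeZero)) {
    // Push this track's media forward by everything that has already been laid down.
    [videoTrack insertEmptyTimeRange:CMTimeRangeMake(kCMTimeZero, cursor)];
}
NSError *error = nil;
// The first argument is the range within the source asset, so it starts at kCMTimeZero.
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, cliptime)
                    ofTrack:assetTrack
                     atTime:cursor
                      error:&error];
cursor = CMTimeAdd(cursor, cliptime);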
Alternatively, and more simply, use a single AVMutableCompositionTrack.
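A minimal sketch of that single-track approach, assuming the same assets array of NSURLs used in the question, might look like this:

// Sketch: append every clip to one video composition track, back to back.
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime cursor = kCMTimeZero;
for (NSURL *assetUrl in assets) {
    AVAsset *asset = [AVAsset assetWithURL:assetUrl];
    AVAssetTrack *sourceTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
    NSError *error = nil;
    // The time range is relative to the source asset; the clip lands at cursor
    // in the composition's timeline.
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                        ofTrack:sourceTrack
                         atTime:cursor
                          error:&error];
    if (error) {
        NSLog(@"Insert failed: %@", error);
    }
    cursor = CMTimeAdd(cursor, asset.duration);
}

Per-clip transforms can still be applied with a single AVMutableVideoCompositionLayerInstruction for this one track, by calling setTransform:atTime: at each clip boundary.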
Answer 1 (score: 0)
See this post: iOS Combine three videos - rotate the center video
That post shows how to use a single track instead of multiple tracks.