I'm trying to convert a .mov video to .mp4 while also correcting its orientation.
The code below works fine when the video was recorded with UIImagePickerController,
but if the video is picked from the camera roll I get this error, and I don't understand why:
Export failed: Operation Stopped : Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x1815ca50 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}
I tried saving the video to another file first, but it made no difference.
Here is the code I use to convert the video:
- (void)convertVideoToLowQuailtyAndFixRotationWithInputURL:(NSURL*)inputURL handler:(void (^)(NSURL *outURL))handler
{
    if ([[inputURL pathExtension] isEqualToString:@"MOV"])
    {
        NSURL *outputURL = [inputURL URLByDeletingPathExtension];
        outputURL = [outputURL URLByAppendingPathExtension:@"mp4"];
        AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:inputURL options:nil];

        AVAssetTrack *sourceVideoTrack = [[avAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVAssetTrack *sourceAudioTrack = [[avAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

        AVMutableComposition *composition = [AVMutableComposition composition];
        AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                                       ofTrack:sourceVideoTrack
                                        atTime:kCMTimeZero error:nil];
        [compositionVideoTrack setPreferredTransform:sourceVideoTrack.preferredTransform];

        AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                                       ofTrack:sourceAudioTrack
                                        atTime:kCMTimeZero error:nil];

        AVMutableVideoComposition *videoComposition = [self getVideoComposition:avAsset];

        NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:avAsset];
        if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality])
        {
            AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
            exportSession.outputURL = outputURL;
            exportSession.outputFileType = AVFileTypeMPEG4;
            exportSession.shouldOptimizeForNetworkUse = YES;
            exportSession.videoComposition = videoComposition;
            [exportSession exportAsynchronouslyWithCompletionHandler:^{
                switch ([exportSession status])
                {
                    case AVAssetExportSessionStatusFailed:
                        NSLog(@"Export failed: %@ : %@", [[exportSession error] localizedDescription], [exportSession error]);
                        handler(nil);
                        break;
                    case AVAssetExportSessionStatusCancelled:
                        NSLog(@"Export canceled");
                        handler(nil);
                        break;
                    default:
                        handler(outputURL);
                        break;
                }
            }];
        }
    } else {
        handler(inputURL);
    }
}
- (AVMutableVideoComposition *)getVideoComposition:(AVAsset *)asset
{
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];

    CGSize videoSize = videoTrack.naturalSize;
    BOOL isPortrait_ = [self isVideoPortrait:asset];
    if (isPortrait_) {
        // NSLog(@"video is portrait ");
        videoSize = CGSizeMake(videoSize.height, videoSize.width);
    }
    composition.naturalSize = videoSize;
    videoComposition.renderSize = videoSize;
    videoComposition.frameDuration = CMTimeMakeWithSeconds( 1 / videoTrack.nominalFrameRate, 600);

    AVMutableCompositionTrack *compositionVideoTrack;
    compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil];

    AVMutableVideoCompositionLayerInstruction *layerInst;
    layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    [layerInst setTransform:videoTrack.preferredTransform atTime:kCMTimeZero];

    AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
    inst.layerInstructions = [NSArray arrayWithObject:layerInst];
    videoComposition.instructions = [NSArray arrayWithObject:inst];
    return videoComposition;
}
Answer 0 (score: 13)
AVFoundation error constant -11841 means your video composition is invalid. If you want to know more about the error constants, see this link: https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVFoundation_ErrorConstants/Reference/reference.html
Although nothing jumps out at me as a major error, I can suggest the following ways to narrow down the source of your problem.
First, don't pass nil for the error parameter in these calls:
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                               ofTrack:sourceVideoTrack
                                atTime:kCMTimeZero error:nil];
Create an NSError object and pass a reference to it, like so:
NSError *error = nil;
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                               ofTrack:sourceVideoTrack
                                atTime:kCMTimeZero error:&error];
Check the error to make sure your video and audio tracks were inserted into the composition tracks correctly; if all goes well, the error should be nil:
if(error)
NSLog(@"Insertion error: %@", error);
You may also want to check the AVAsset's composable, exportable, and hasProtectedContent properties. If these are not YES, YES, and NO respectively, you may have a problem creating your new video file.
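For example, a quick diagnostic using the avAsset from the question's code might look like this (a sketch, not a definitive check):
// Log the asset's suitability for composition and export.
NSLog(@"composable: %d, exportable: %d, protected: %d",
      avAsset.composable, avAsset.exportable, avAsset.hasProtectedContent);
if (!avAsset.composable || !avAsset.exportable || avAsset.hasProtectedContent) {
    NSLog(@"This asset cannot be safely composed or exported.");
}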
I've occasionally had an issue where a time range created with the 600 timescale does not play well with the audio track when used in a composition alongside a recorded video track. You may want to create a new CMTime for the duration (avAsset.duration) in CMTimeRangeMake(kCMTimeZero, avAsset.duration), used only when inserting the audio track, and give that new CMTime a timescale of 44100 (or the sample rate of the audio track). The same goes for videoComposition.frameDuration: depending on the nominalFrameRate of your video track, your time may not be represented correctly with a 600 timescale. A sketch of both adjustments follows below.
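This sketch reuses the variable names from the question's code (the audio insertion lives in the conversion method, the frameDuration line in getVideoComposition:); 44100 is an assumed sample rate:
// Insert the audio track using a CMTime expressed in the audio sample rate
// (44100 is an assumption; use the track's actual sample rate if it differs).
NSError *error = nil;
CMTime audioDuration = CMTimeConvertScale(avAsset.duration, 44100, kCMTimeRoundingMethod_Default);
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioDuration)
                               ofTrack:sourceAudioTrack
                                atTime:kCMTimeZero error:&error];

// Derive the frame duration directly from the track's nominal frame rate
// instead of assuming a 600 timescale.
videoComposition.frameDuration = CMTimeMake(1, (int32_t)roundf(videoTrack.nominalFrameRate));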
Finally, Apple provides a useful tool for debugging video compositions:
https://developer.apple.com/library/mac/samplecode/AVCompositionDebugViewer/Introduction/Intro.html
It gives a visual representation of your composition, so you can see where things don't look the way they should.
Answer 1 (score: 2)
Try commenting out the following line and running your project:
exportSession.videoComposition = videoComposition;
Answer 2 (score: 1)
You should definitely use AVVideoComposition's isValidForAsset:timeRange:validationDelegate: method; it will diagnose any problems with your video composition. I had the same issue, and my solution was to create the layerInstruction with the AVMutableCompositionTrack instead of the original track:
layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
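For reference, a minimal validation call might look like this (a sketch; composition and videoComposition are the objects built in the question's code, and passing a nil delegate simply returns whether the composition is valid):
// Validate the video composition against the composition that will actually be exported.
BOOL valid = [videoComposition isValidForAsset:composition
                                     timeRange:CMTimeRangeMake(kCMTimeZero, composition.duration)
                            validationDelegate:nil];
if (!valid) {
    NSLog(@"videoComposition failed validation against the composition");
}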