iOS AVFoundation: setting the orientation of video

Date: 2012-11-20 02:29:47

Tags: ios video avfoundation quicktime

I have been struggling with controlling video orientation during and after capture on an iOS device. Thanks to previous answers and documentation from Apple, I have been able to figure it out. However, now that I want to push some video to a website, I am running into particular problems. I outlined this problem in this question, and the proposed solution turns out to require setting orientation options during video encoding.

That may be, but I have no idea how to go about doing it. The documentation around setting orientation is about setting it correctly for display on the device, and I have implemented the advice found here. However, that advice does not address setting the orientation correctly for non-Apple software such as VLC or the Chrome browser.

Can anyone give me insight into how to set the orientation properly on the device so that it displays correctly in all viewing software?

5 Answers:

Answer 0 (score: 10)

Finally, based on the answers of @Aaron Vegh and @Prince, I figured out my solution:

// Convert the video

+(void)convertMOVToMp4:(NSString *)movFilePath completion:(void (^)(NSString *mp4FilePath))block{


AVURLAsset * videoAsset = [[AVURLAsset alloc]initWithURL:[NSURL fileURLWithPath:movFilePath]  options:nil];

AVAssetTrack *sourceAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

AVMutableComposition* composition = [AVMutableComposition composition];


AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                            preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                               ofTrack:sourceAudioTrack
                                atTime:kCMTimeZero error:nil];




AVAssetExportSession * assetExport = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                      presetName:AVAssetExportPresetMediumQuality];


NSString *exportPath =  [movFilePath stringByReplacingOccurrencesOfString:@".MOV" withString:@".mp4"];


NSURL * exportUrl = [NSURL fileURLWithPath:exportPath];


assetExport.outputFileType = AVFileTypeMPEG4;
assetExport.outputURL = exportUrl;
assetExport.shouldOptimizeForNetworkUse = YES;
assetExport.videoComposition = [self getVideoComposition:videoAsset composition:composition];

[assetExport exportAsynchronouslyWithCompletionHandler:
 ^(void) {
     switch (assetExport.status)
     {
         case AVAssetExportSessionStatusCompleted:
             // export complete
             if (block) {
                 block(exportPath);
             }
             break;
         case AVAssetExportSessionStatusFailed:
         case AVAssetExportSessionStatusCancelled:
             if (block) {
                 block(nil);
             }
             break;
         default:
             break;
     }
 }];
}

// Get a video composition for the current orientation

+(AVMutableVideoComposition *) getVideoComposition:(AVAsset *)asset composition:(AVMutableComposition *)composition{
    BOOL isPortrait_ = [self isVideoPortrait:asset];


    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];


    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil];

    AVMutableVideoCompositionLayerInstruction *layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];

    CGAffineTransform transform = videoTrack.preferredTransform;
    [layerInst setTransform:transform atTime:kCMTimeZero];


    AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
    inst.layerInstructions = [NSArray arrayWithObject:layerInst];


    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.instructions = [NSArray arrayWithObject:inst];

    CGSize videoSize = videoTrack.naturalSize;
    if(isPortrait_) {
        NSLog(@"video is portrait ");
        videoSize = CGSizeMake(videoSize.height, videoSize.width);
    }
    videoComposition.renderSize = videoSize;
    videoComposition.frameDuration = CMTimeMake(1,30);
    videoComposition.renderScale = 1.0;
    return videoComposition;
}

// Determine whether the video is portrait

+(BOOL) isVideoPortrait:(AVAsset *)asset{
BOOL isPortrait = FALSE;
NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
if ([tracks count] > 0) {
    AVAssetTrack *videoTrack = [tracks objectAtIndex:0];

    CGAffineTransform t = videoTrack.preferredTransform;
    // Portrait
    if(t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0)
    {
        isPortrait = YES;
    }
    // PortraitUpsideDown
    if(t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0)  {

        isPortrait = YES;
    }
    // LandscapeRight
    if(t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0)
    {
        isPortrait = FALSE;
    }
    // LandscapeLeft
    if(t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0)
    {
        isPortrait = FALSE;
    }
}
return isPortrait;
}
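
For reference, a call site for the converter might look like this sketch (the containing class name VideoConverter and the movPath variable are assumptions for illustration, not part of the original answer):

// Hypothetical call site; class name and input path are assumed.
[VideoConverter convertMOVToMp4:movPath completion:^(NSString *mp4FilePath) {
    // mp4FilePath is nil when the export failed or was cancelled.
    if (mp4FilePath) {
        NSLog(@"Converted MP4 written to %@", mp4FilePath);
    } else {
        NSLog(@"Conversion failed or was cancelled");
    }
}];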

Answer 1 (score: 5)

In Apple's documentation here, it states:

Clients can now receive physically rotated CVPixelBuffers in their AVCaptureVideoDataOutput -captureOutput:didOutputSampleBuffer:fromConnection: delegate callback. In previous iOS versions, the front-facing camera would always deliver buffers in AVCaptureVideoOrientationLandscapeLeft and the back-facing camera would always deliver buffers in AVCaptureVideoOrientationLandscapeRight. All 4 AVCaptureVideoOrientations are supported, and rotation is hardware accelerated. To request buffer rotation, a client calls -setVideoOrientation: on the AVCaptureVideoDataOutput's video AVCaptureConnection. Note that physically rotating buffers does come with a performance cost, so only request rotation if necessary. If, for instance, you want rotated video written to a QuickTime movie file using AVAssetWriter, it is preferable to set the -transform property on the AVAssetWriterInput rather than physically rotate the buffers in AVCaptureVideoDataOutput.

So the solution posted by Aaron Vegh using an AVAssetExportSession works, but is not needed. As the Apple doc says, if you want the orientation set properly so the video plays in non-Apple QuickTime players like VLC, or on the web with Chrome, you must set the video orientation on the AVCaptureConnection for the AVCaptureVideoDataOutput. If you try to set it on the AVAssetWriterInput instead, you will get the wrong orientation in players like VLC and Chrome.

Here is the code where I set it during setup of my capture session:

// DECLARED AS PROPERTIES ABOVE
@property (strong,nonatomic) AVCaptureDeviceInput *audioIn;
@property (strong,nonatomic) AVCaptureAudioDataOutput *audioOut;
@property (strong,nonatomic) AVCaptureDeviceInput *videoIn;
@property (strong,nonatomic) AVCaptureVideoDataOutput *videoOut;
@property (strong,nonatomic) AVCaptureConnection *audioConnection;
@property (strong,nonatomic) AVCaptureConnection *videoConnection;
------------------------------------------------------------------
------------------------------------------------------------------

-(void)setupCaptureSession{
// Setup Session
self.session = [[AVCaptureSession alloc]init];
[self.session setSessionPreset:AVCaptureSessionPreset640x480];

// Create Audio connection ----------------------------------------
self.audioIn = [[AVCaptureDeviceInput alloc]initWithDevice:[self getAudioDevice] error:nil];
if ([self.session canAddInput:self.audioIn]) {
    [self.session addInput:self.audioIn];
}

self.audioOut = [[AVCaptureAudioDataOutput alloc]init];
dispatch_queue_t audioCaptureQueue = dispatch_queue_create("Audio Capture Queue", DISPATCH_QUEUE_SERIAL);
[self.audioOut setSampleBufferDelegate:self queue:audioCaptureQueue];
if ([self.session canAddOutput:self.audioOut]) {
    [self.session addOutput:self.audioOut];
}
self.audioConnection = [self.audioOut connectionWithMediaType:AVMediaTypeAudio];

// Create Video connection ----------------------------------------
self.videoIn = [[AVCaptureDeviceInput alloc]initWithDevice:[self videoDeviceWithPosition:AVCaptureDevicePositionBack] error:nil];
if ([self.session canAddInput:self.videoIn]) {
    [self.session addInput:self.videoIn];
}

self.videoOut = [[AVCaptureVideoDataOutput alloc]init];
[self.videoOut setAlwaysDiscardsLateVideoFrames:NO];
[self.videoOut setVideoSettings:nil];
dispatch_queue_t videoCaptureQueue =  dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL);
[self.videoOut setSampleBufferDelegate:self queue:videoCaptureQueue];
if ([self.session canAddOutput:self.videoOut]) {
    [self.session addOutput:self.videoOut];
}

self.videoConnection = [self.videoOut connectionWithMediaType:AVMediaTypeVideo];
// SET THE ORIENTATION HERE -------------------------------------------------
[self.videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
// --------------------------------------------------------------------------

// Create Preview Layer -------------------------------------------
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc]initWithSession:self.session];
CGRect bounds = self.videoView.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
previewLayer.bounds = bounds;
previewLayer.position=CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
[self.videoView.layer addSublayer:previewLayer];

// Start session
[self.session startRunning];

}
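
For completeness, here is a minimal sketch of the delegate callback that receives the physically rotated buffers; the AVAssetWriter wiring (self.videoWriterInput) is an assumption for illustration and not part of the original answer:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Because -setVideoOrientation: was called on the video connection,
    // these buffers arrive already rotated and can be appended as-is.
    if (connection == self.videoConnection &&
        self.videoWriterInput.isReadyForMoreMediaData) {
        [self.videoWriterInput appendSampleBuffer:sampleBuffer];
    }
}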

Answer 2 (score: 3)

In case anyone else is looking for this answer as well, here is the method that I cooked up (modified a bit for simplicity):

- (void)encodeVideoOrientation:(NSURL *)anOutputFileURL
{
CGAffineTransform rotationTransform;
CGAffineTransform rotateTranslate;
CGSize renderSize;

switch (self.recordingOrientation)
{
    // set rotationTransform, rotateTranslate and renderSize based on orientation

}
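
// A hedged illustration (not from the original answer) of what the
// elided cases might assign for portrait recording, assuming a
// 640x480 capture preset:
//
//   rotationTransform = CGAffineTransformMakeRotation(M_PI_2);
//   rotateTranslate   = CGAffineTransformTranslate(rotationTransform, 0, -480);
//   renderSize        = CGSizeMake(480.0, 640.0);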


AVURLAsset * videoAsset = [[AVURLAsset alloc]initWithURL:anOutputFileURL options:nil];

AVAssetTrack *sourceVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *sourceAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

AVMutableComposition* composition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                               ofTrack:sourceVideoTrack
                                atTime:kCMTimeZero error:nil];
[compositionVideoTrack setPreferredTransform:sourceVideoTrack.preferredTransform];

AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                            preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                               ofTrack:sourceAudioTrack
                                atTime:kCMTimeZero error:nil];



AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
[layerInstruction setTransform:rotateTranslate atTime:kCMTimeZero];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1,30);
videoComposition.renderScale = 1.0;
videoComposition.renderSize = renderSize;
instruction.layerInstructions = [NSArray arrayWithObject: layerInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
videoComposition.instructions = [NSArray arrayWithObject: instruction];

AVAssetExportSession * assetExport = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                      presetName:AVAssetExportPresetMediumQuality];

NSString* videoName = @"export.mov";
NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];

NSURL * exportUrl = [NSURL fileURLWithPath:exportPath];

if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath])
{
    [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
}

assetExport.outputFileType = AVFileTypeMPEG4;
assetExport.outputURL = exportUrl;
assetExport.shouldOptimizeForNetworkUse = YES;
assetExport.videoComposition = videoComposition;

[assetExport exportAsynchronouslyWithCompletionHandler:
 ^(void ) {
     switch (assetExport.status)
     {
         case AVAssetExportSessionStatusCompleted:
             //                export complete
             NSLog(@"Export Complete");
             break;
         case AVAssetExportSessionStatusFailed:
             NSLog(@"Export Failed");
             NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
             //                export error (see exportSession.error)
             break;
         case AVAssetExportSessionStatusCancelled:
             NSLog(@"Export Failed");
             NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
             //                export cancelled
             break;
     }
 }];

}

Sadly, this stuff is poorly documented, but by stringing together examples from other SO questions and reading the header files, I was able to get this working. Hope this helps anyone else!

Answer 3 (score: 2)

To set the correct orientation of a video, use the following method, which builds an AVMutableVideoComposition for the asset based on its preferred transform:
-(AVMutableVideoComposition *) getVideoComposition:(AVAsset *)asset
{
  AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
  AVMutableComposition *composition = [AVMutableComposition composition];
  AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
  CGSize videoSize = videoTrack.naturalSize;
  BOOL isPortrait_ = [self isVideoPortrait:asset];
  if(isPortrait_) {
      NSLog(@"video is portrait ");
      videoSize = CGSizeMake(videoSize.height, videoSize.width);
  }
  composition.naturalSize     = videoSize;
  videoComposition.renderSize = videoSize;
  // videoComposition.renderSize = videoTrack.naturalSize; //
  videoComposition.frameDuration = CMTimeMakeWithSeconds( 1 / videoTrack.nominalFrameRate, 600);

  AVMutableCompositionTrack *compositionVideoTrack;
  compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
  [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil];
  AVMutableVideoCompositionLayerInstruction *layerInst;
  layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
  [layerInst setTransform:videoTrack.preferredTransform atTime:kCMTimeZero];
  AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
  inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
  inst.layerInstructions = [NSArray arrayWithObject:layerInst];
  videoComposition.instructions = [NSArray arrayWithObject:inst];
  return videoComposition;
}


-(BOOL) isVideoPortrait:(AVAsset *)asset
{
  BOOL isPortrait = FALSE;
  NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
  if ([tracks count] > 0) {
    AVAssetTrack *videoTrack = [tracks objectAtIndex:0];

    CGAffineTransform t = videoTrack.preferredTransform;
    // Portrait
    if(t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0)
    {
        isPortrait = YES;
    }
    // PortraitUpsideDown
    if(t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0)  {

        isPortrait = YES;
    }
    // LandscapeRight
    if(t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0)
    {
        isPortrait = FALSE;
    }
    // LandscapeLeft
    if(t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0)
    {
        isPortrait = FALSE;
    }
   }
  return isPortrait;
}
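
The returned composition can then be handed to an export session, along the same lines as answer 0; in this sketch, asset, outputURL, and the completion handling are assumed:

AVAssetExportSession *session =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetMediumQuality];
session.videoComposition = [self getVideoComposition:asset];
session.outputFileType = AVFileTypeMPEG4;
session.outputURL = outputURL;
session.shouldOptimizeForNetworkUse = YES;
[session exportAsynchronouslyWithCompletionHandler:^{
    // Inspect session.status and session.error here.
}];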

Answer 4 (score: 1)

As of iOS 5 you can request rotated CVPixelBuffers using AVCaptureVideoDataOutput, documented here. This gives you the correct orientation without needing to reprocess the video again with an AVAssetExportSession.
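
A minimal sketch of that request, assuming videoDataOutput has already been added to a running AVCaptureSession:

AVCaptureConnection *connection =
    [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoOrientationSupported) {
    // Buffers delivered to the sample buffer delegate will now be
    // physically rotated to the requested orientation.
    connection.videoOrientation = AVCaptureVideoOrientationPortrait;
}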