How do I save recorded video using AVAssetWriter?

Date: 2018-11-20 16:20:32

Tags: ios, avfoundation

I have tried many other blog posts and Stack Overflow answers, but found no solution. I can create a custom camera with a preview. I need the video with a custom frame size, which is why I am using AVAssetWriter. But I am unable to save the recorded video to the Documents directory. I tried it like this:

-(void) initilizeCameraConfigurations {

    if(!captureSession) {
        captureSession = [[AVCaptureSession alloc] init];
        [captureSession beginConfiguration];
        captureSession.sessionPreset = AVCaptureSessionPresetHigh;
        self.view.backgroundColor = UIColor.blackColor;
        CGRect bounds = self.view.bounds;
        captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
        captureVideoPreviewLayer.backgroundColor = [UIColor clearColor].CGColor;
        captureVideoPreviewLayer.bounds = self.view.frame;
        captureVideoPreviewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
        captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        captureVideoPreviewLayer.position = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
        [self.view.layer addSublayer:captureVideoPreviewLayer];
        [self.view bringSubviewToFront:self.controlsBgView];
    }

    // Add input to session
    NSError *err;
    videoCaptureDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&err];
    if([captureSession canAddInput:videoCaptureDeviceInput]) {
        [captureSession addInput:videoCaptureDeviceInput];
    }

    docPathUrl = [[NSURL alloc] initFileURLWithPath:[self getDocumentsUrl]];

    assetWriter = [AVAssetWriter assetWriterWithURL:docPathUrl fileType:AVFileTypeQuickTimeMovie error:&err];
    NSParameterAssert(assetWriter);
    //assetWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, 1000);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:300], AVVideoWidthKey,
                                   [NSNumber numberWithInt:300], AVVideoHeightKey,
                                   nil];

    writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    writerInput.expectsMediaDataInRealTime = YES;
    writerInput.transform = CGAffineTransformMakeRotation(M_PI);

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                           [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                                           [NSNumber numberWithInt:300], kCVPixelBufferWidthKey,
                                                           [NSNumber numberWithInt:300], kCVPixelBufferHeightKey,
                                                           nil];

    assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];

    if([assetWriter canAddInput:writerInput]) {
        [assetWriter addInput:writerInput];
    }

    // Set video stabilization mode on the preview layer's connection
    AVCaptureVideoStabilizationMode stabilizationMode = AVCaptureVideoStabilizationModeCinematic;
    if([videoCaptureDevice.activeFormat isVideoStabilizationModeSupported:stabilizationMode]) {
        [captureVideoPreviewLayer.connection setPreferredVideoStabilizationMode:stabilizationMode];
    }

    // Still image output
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [captureSession addOutput:stillImageOutput];

    [captureSession commitConfiguration];
    if (![captureVideoPreviewLayer.connection isEnabled]) {
        [captureVideoPreviewLayer.connection setEnabled:YES];
    }
    [captureSession startRunning];
}
-(IBAction)startStopVideoRecording:(id)sender {

    if(captureSession) {
        if(isVideoRecording) {
            [writerInput markAsFinished];
            [assetWriter finishWritingWithCompletionHandler:^{
                NSLog(@"Finished writing...checking completion status...");
                if (assetWriter.status == AVAssetWriterStatusCompleted) {
                    // Video saved
                } else {
                    NSLog(@"#123 Video writing failed: %@", assetWriter.error);
                }
            }];
            isVideoRecording = NO;
        } else {
            [assetWriter startWriting];
            [assetWriter startSessionAtSourceTime:kCMTimeZero];
            isVideoRecording = YES;
        }
    }
}
-(NSString *) getDocumentsUrl {

    NSString *docPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
    docPath = [[docPath stringByAppendingPathComponent:@"Movie"] stringByAppendingString:@".mov"];
    if([[NSFileManager defaultManager] fileExistsAtPath:docPath]) {
        NSError *err;
        [[NSFileManager defaultManager] removeItemAtPath:docPath error:&err];
    }
    NSLog(@"Movie path : %@", docPath);
    return docPath;
}


@end

Please correct me if anything is wrong. Thanks in advance.

2 Answers:

Answer 0 (score: 1)

You don't say what actually goes wrong, but two things look wrong in your code:

docPath = [[docPath stringByAppendingPathComponent:@"Movie"] stringByAppendingString:@".mov"];

looks like it creates an unwanted path such as @"/path/Movie/.mov", when what you want is:

docPath = [docPath stringByAppendingPathComponent:@"Movie.mov"];
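Equivalently, you can avoid the string surgery entirely by building the destination with the URL-based NSFileManager API. A minimal sketch (the variable names docsURL and movieURL are illustrative, not from the question's code):

```objc
// Sketch: build the movie URL with Foundation's URL APIs instead of
// concatenating path strings by hand.
NSURL *docsURL = [[NSFileManager defaultManager] URLForDirectory:NSDocumentDirectory
                                                        inDomain:NSUserDomainMask
                                               appropriateForURL:nil
                                                          create:YES
                                                           error:NULL];
NSURL *movieURL = [docsURL URLByAppendingPathComponent:@"Movie.mov"];
// AVAssetWriter refuses to write over an existing file, so remove any leftover.
[[NSFileManager defaultManager] removeItemAtURL:movieURL error:NULL];
```

The resulting movieURL can then be passed straight to assetWriterWithURL:fileType:error: without the intermediate NSString round-trip.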

Your timing is also wrong. Your asset writer starts its session at time 0, while the sample buffers start at CMSampleBufferGetPresentationTimeStamp(sampleBuffer) > 0, so instead do this:

-(void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if(firstSampleBuffer) { // firstSampleBuffer is a BOOL ivar, set to YES when recording starts
        [assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
        firstSampleBuffer = NO;
    }

    [writerInput appendSampleBuffer:sampleBuffer];
}

Answer 1 (score: 1)

Conceptually, there are two main functional areas involved: one that produces video frames (the AVCaptureSession and everything attached to it) and another that writes those frames to a file (the AVAssetWriter with its attached inputs).
The problem with your code is that there is no connection between the two: no video frames/images coming out of the capture session are ever passed to the asset writer input.

Furthermore, the AVCaptureStillImageOutput method -captureStillImageAsynchronouslyFromConnection:completionHandler: is never called anywhere, so the capture session doesn't actually produce any frames.

So, as a minimum, implement something like this:

-(IBAction)captureStillImageAndAppend:(id)sender {
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageOutput.connections.firstObject
                                                  completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        // check error, omitted here
        if (CMTIME_IS_INVALID(startTime)) // startTime is an ivar
            [assetWriter startSessionAtSourceTime:(startTime = CMSampleBufferGetPresentationTimeStamp(imageDataSampleBuffer))];
        [writerInput appendSampleBuffer:imageDataSampleBuffer];
    }];
}

Remove the AVAssetWriterInputPixelBufferAdaptor; it isn't used.

But there are issues with AVCaptureStillImageOutput:

  • it is only meant to produce still images, not videos

  • it must be configured to produce uncompressed sample buffers if the asset writer input is configured to compress the appended sample buffers (stillImageOutput.outputSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };)

  • it is deprecated under iOS

If you actually want to produce a video, as opposed to a sequence of still images, instead of the AVCaptureStillImageOutput add an AVCaptureVideoDataOutput to the capture session. It needs a delegate and a serial dispatch queue to output the sample buffers. The delegate has to implement something like this:

-(void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (CMTIME_IS_INVALID(startTime)) // startTime is an ivar
        [assetWriter startSessionAtSourceTime:(startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer))];
    [writerInput appendSampleBuffer:sampleBuffer];
}

Note that

  • you will want to make sure that the AVCaptureVideoDataOutput only outputs frames when you are actually recording; add/remove it from the capture session, or enable/disable its connection, in the startStopVideoRecording action

  • reset startTime to kCMTimeInvalid before starting another recording
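Under the assumption that the controller keeps the output and timing state in ivars (the names videoDataOutput, sampleQueue, startTime, and isVideoRecording here are illustrative, not from the original code), the setup and the recording gate described in the notes above might be sketched like this:

```objc
// Sketch: wire an AVCaptureVideoDataOutput into the session and gate its
// connection from the record button. Assumes the controller adopts
// AVCaptureVideoDataOutputSampleBufferDelegate and declares
// AVCaptureVideoDataOutput *videoDataOutput; and CMTime startTime; as ivars.
- (void)addVideoDataOutput {
    videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    dispatch_queue_t sampleQueue = dispatch_queue_create("video.sample.queue", DISPATCH_QUEUE_SERIAL);
    [videoDataOutput setSampleBufferDelegate:self queue:sampleQueue];
    if ([captureSession canAddOutput:videoDataOutput]) {
        [captureSession addOutput:videoDataOutput];
    }
    // Keep the connection disabled until recording actually starts.
    [videoDataOutput connectionWithMediaType:AVMediaTypeVideo].enabled = NO;
    startTime = kCMTimeInvalid;
}

- (IBAction)startStopVideoRecording:(id)sender {
    AVCaptureConnection *connection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
    if (isVideoRecording) {
        connection.enabled = NO;       // stop delivering frames first
        isVideoRecording = NO;
        [writerInput markAsFinished];
        [assetWriter finishWritingWithCompletionHandler:^{ /* check assetWriter.status */ }];
    } else {
        startTime = kCMTimeInvalid;    // reset before every new recording
        [assetWriter startWriting];
        connection.enabled = YES;
        isVideoRecording = YES;
    }
}
```

Disabling the connection (rather than tearing down the output) keeps the session configuration stable between recordings while ensuring the delegate callback only fires while a writer session is active.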