How to create a video from frames on the iPhone

Date: 2012-05-18 05:19:28

Tags: objective-c video avfoundation mpmovieplayercontroller

I have done my R&D and have successfully managed to extract frames, as images, from a video file playing in MPMoviePlayerController.

I grab all the frames with the code below and store the images in an array:

// moviePlayerController.duration is in seconds, so this grabs roughly
// one thumbnail per second, snapped to the nearest keyframe.
for (int i = 1; i <= moviePlayerController.duration; i++)
{
    UIImage *img = [moviePlayerController thumbnailImageAtTime:i
                                                    timeOption:MPMovieTimeOptionNearestKeyFrame];
    [arrImages addObject:img];
}
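
Note that MPMovieTimeOptionNearestKeyFrame only snaps to keyframes, so this yields at most one approximate image per second. If exact, per-frame extraction matters, AVAssetImageGenerator can be pointed at the same file instead. A minimal sketch, assuming videoURL points at the movie and a target rate of 30 fps (both are placeholders, and the zero-tolerance properties need iOS 5+):

// Sketch: exact-time frame extraction with AVAssetImageGenerator.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
generator.requestedTimeToleranceBefore = kCMTimeZero;  // exact frames, not keyframes
generator.requestedTimeToleranceAfter  = kCMTimeZero;

int32_t fps = 30;
int64_t totalFrames = (int64_t)(CMTimeGetSeconds(asset.duration) * fps);
for (int64_t f = 0; f < totalFrames; f++)
{
    NSError *error = nil;
    CGImageRef cgImage = [generator copyCGImageAtTime:CMTimeMake(f, fps)
                                           actualTime:NULL
                                                error:&error];
    if (cgImage)
    {
        [arrImages addObject:[UIImage imageWithCGImage:cgImage]];
        CGImageRelease(cgImage);
    }
}

Either way, holding every frame of a long clip as a UIImage in an array will exhaust memory quickly; writing each frame out as soon as it is produced is safer.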

Now the question: after modifying some of these images (say, adding effects and applying filters such as "movie real" or black-and-white), how can we build a video from them again, at the same frame rate, and store it in the Documents directory without losing quality?
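
As an illustration of the kind of edit involved, a black-and-white pass could be done by zeroing out saturation with Core Image (a sketch assuming iOS 5+ and CoreImage.framework; the method name is made up):

// Sketch: convert a UIImage to black-and-white via CIColorControls.
- (UIImage *)blackAndWhiteImage:(UIImage *)source
{
    CIImage *input = [CIImage imageWithCGImage:source.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
    [filter setValue:input forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:0.0] forKey:kCIInputSaturationKey];

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:filter.outputImage
                                       fromRect:filter.outputImage.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}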

After editing the images, I use the following code to write them back out as a video:

- (void) writeImagesAsMovie:(NSString*)path 
{
    NSError *error = nil;
    UIImage *first = [arrImages objectAtIndex:0];
    CGSize frameSize = first.size;

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                           fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);

    // The writer's dimensions must match the pixel buffers appended below,
    // so take them from the first image instead of hard-coding 640x480.
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:frameSize.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:frameSize.height], AVVideoHeightKey,
                                   nil];
    // No extra retain needed: the writer retains its inputs once added,
    // and the original retain was never balanced by a release.
    AVAssetWriterInput *writerInput = [AVAssetWriterInput
                                       assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                     sourcePixelBufferAttributes:nil];

    NSParameterAssert(writerInput);
    NSParameterAssert([videoWriter canAddInput:writerInput]);
    [videoWriter addInput:writerInput];

    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    // kRecordingFPS must be defined elsewhere, e.g. #define kRecordingFPS 30,
    // and should match the rate the frames were extracted at.
    int frameCount = 0;
    for (UIImage *img in arrImages)
    {
        CVPixelBufferRef buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];

        // Block until the input can accept more data; the original code
        // silently dropped (and leaked) the frame whenever it was busy.
        while (!adaptor.assetWriterInput.readyForMoreMediaData)
            [NSThread sleepForTimeInterval:0.05];

        CMTime frameTime = CMTimeMake(frameCount, (int32_t)kRecordingFPS);
        [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];

        if (buffer)
            CVBufferRelease(buffer);
        frameCount++;
    }

    [writerInput markAsFinished];
    [videoWriter finishWriting];
    [videoWriter release]; // balance the alloc/init above (manual retain/release)
}


- (CVPixelBufferRef) newPixelBufferFromCGImage:(CGImageRef)image andFrameSize:(CGSize)frameSize
{
    // The "new" prefix signals a +1 retained buffer: the caller owns the
    // result and must release it with CVBufferRelease.
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width,
                                          frameSize.height, kCVPixelFormatType_32ARGB,
                                          (CFDictionaryRef)options, &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    // Ask the buffer for its actual row stride: Core Video may pad rows,
    // so hard-coding 4 * width can skew the image.
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width,
                                                 frameSize.height, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace, kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);

    // Draw into the full buffer so every frame is scaled to frameSize;
    // drawing at the image's own size only works when all frames already
    // match the first one.
    CGContextDrawImage(context, CGRectMake(0, 0, frameSize.width, frameSize.height), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}
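
A short usage sketch for calling this with a path in the Documents directory, as the question asks (the file name edited.mov is just a placeholder):

// Build a path in the app's Documents directory and write the movie there.
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                              NSUserDomainMask, YES) objectAtIndex:0];
NSString *moviePath = [documentsDir stringByAppendingPathComponent:@"edited.mov"];

// AVAssetWriter fails if the target file already exists, so clear any old copy first.
[[NSFileManager defaultManager] removeItemAtPath:moviePath error:NULL];

[self writeImagesAsMovie:moviePath];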

I am new to this topic, so please help me solve this problem.

2 Answers:

Answer 0 (score: 6):

Answer 1 (score: 0):

You need to capture an image for every frame that is rendered to the screen. I used the private UIGetScreenImage() API to do exactly that, and then used AVAssetWriter to write those images out as a movie.
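
Keep in mind that UIGetScreenImage() is a private API and can get an app rejected from the App Store. A public-API alternative (a sketch, where view stands for whatever view you want to capture) is to render the view's layer into an image context:

// Public-API alternative to UIGetScreenImage(): snapshot a view's layer.
// Requires QuartzCore.framework (#import <QuartzCore/QuartzCore.h>).
UIGraphicsBeginImageContextWithOptions(view.bounds.size, YES, 0.0);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();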

I wrote a wrapper that does all of this in just a few lines of code; see this open-source component:

https://www.cocoacontrols.com/controls/iqprojectvideo