Memory warning when creating an mp4 video from an array of images

Date: 2014-10-03 04:38:29

Tags: ios

I am trying to create a video file from image files. I put the image file names into an NSArray. When the number of image files is large (more than 80 or 100), I get memory warnings and sometimes the app crashes. Here is my code:

-(void)writeImageAsMovie:(NSArray *)images toPath:(NSString *)path size:(CGSize)size duration:(int)duration
{
    NSError *error = nil;

    videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                            fileType:AVFileTypeQuickTimeMovie
                                               error:&error];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                   nil];
    AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                         outputSettings:videoSettings];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                     sourcePixelBufferAttributes:nil];

    NSParameterAssert(writerInput);
    NSParameterAssert([videoWriter canAddInput:writerInput]);
    [videoWriter addInput:writerInput];

    // Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:CMTimeMake(0, 1000)];

    CVPixelBufferRef buffer = NULL;

    // Write samples: load each UIImage from disk, convert it to a pixel buffer, and append it.
    for (int i = 0; i < images.count; i++) {
        UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfFile:[[images objectAtIndex:i] objectForKey:@"image"]]];
        int time = [[[images objectAtIndex:i] objectForKey:@"time"] intValue];
        buffer = [self pixelBufferFromCGImage:image.CGImage];
        while (!adaptor.assetWriterInput.readyForMoreMediaData); // busy-wait until the input accepts more data
        [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(time, 1000)];
        image = nil;
    }

    while (!adaptor.assetWriterInput.readyForMoreMediaData);

    // Finish the session:
    [writerInput markAsFinished];
    [videoWriter finishWritingWithCompletionHandler:^{
        NSLog(@"finished writing %lu", (unsigned long)images.count);
    }];

    NSLog(@"%ld", (long)[videoWriter status]);
    while ([videoWriter status] != AVAssetWriterStatusFailed && [videoWriter status] != AVAssetWriterStatusCompleted) {
        NSLog(@"Status: %ld", (long)[videoWriter status]);
        sleep(1);
    }
    NSLog(@"%ld", (long)[videoWriter status]);

    NSString *tmpdir = NSTemporaryDirectory();
    NSString *mydir = [tmpdir stringByAppendingPathComponent:@"vidimages"];
    [[NSFileManager defaultManager] removeItemAtPath:mydir error:nil];
}




- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
{
    CGFloat screenWidth = [[UIScreen mainScreen] bounds].size.width;

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;

    // Create a square BGRA pixel buffer sized to the screen width.
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, screenWidth, screenWidth,
                                          kCVPixelFormatType_32BGRA,
                                          (__bridge CFDictionaryRef)options, &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    // Draw the CGImage into the buffer's backing memory.
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context1 = CGBitmapContextCreate(pxdata, screenWidth, screenWidth, 8,
                                                  4 * screenWidth, rgbColorSpace,
                                                  kCGImageAlphaNoneSkipLast);
    NSParameterAssert(context1);
    CGContextConcatCTM(context1, CGAffineTransformMakeRotation(0));
    CGContextDrawImage(context1, CGRectMake(0, 0, screenWidth, screenWidth), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context1);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}

1 Answer:

Answer 0 (score: 0):

It looks like a pile of autoreleased UIImage and NSData objects is being allocated here:

for (int i = 0; i < images.count; i++) {
    UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfFile:[[images objectAtIndex:i] objectForKey:@"image"]]];
    int time = [[[images objectAtIndex:i] objectForKey:@"time"] intValue];
    buffer = [self pixelBufferFromCGImage:image.CGImage];
    while (!adaptor.assetWriterInput.readyForMoreMediaData);
    [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(time, 1000)];
    image = nil;
}

To have each iteration's autoreleased objects released as the loop runs (rather than accumulating until the method returns), and to stop memory usage from climbing, wrap the loop body in an autorelease pool:

for (int i = 0; i < images.count; i++) {
    @autoreleasepool {
        UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfFile:[[images objectAtIndex:i] objectForKey:@"image"]]];
        int time = [[[images objectAtIndex:i] objectForKey:@"time"] intValue];
        buffer = [self pixelBufferFromCGImage:image.CGImage];
        while (!adaptor.assetWriterInput.readyForMoreMediaData);
        [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(time, 1000)];
        image = nil;
    }
}

If you're not familiar with them, look up autorelease pools in the Apple documentation. Yes, they are still relevant under ARC.
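
One related caveat: a CVPixelBufferRef is a Core Foundation object, so neither ARC nor the autorelease pool will release it. Since pixelBufferFromCGImage creates the buffer with CVPixelBufferCreate (a Create-rule function returning a +1 reference) and nothing in the loop releases it, each frame likely leaks a full-size pixel buffer as well. A minimal sketch of the loop body with both fixes applied (assuming the buffer really does come from CVPixelBufferCreate, as in the code above):

for (int i = 0; i < images.count; i++) {
    @autoreleasepool {
        UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfFile:[[images objectAtIndex:i] objectForKey:@"image"]]];
        int time = [[[images objectAtIndex:i] objectForKey:@"time"] intValue];
        CVPixelBufferRef buffer = [self pixelBufferFromCGImage:image.CGImage];
        while (!adaptor.assetWriterInput.readyForMoreMediaData);
        [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(time, 1000)];
        // Core Foundation objects are not autoreleased; balance CVPixelBufferCreate's +1 reference.
        // Releasing here is safe because appendPixelBuffer: retains the buffer until it is written.
        CVPixelBufferRelease(buffer);
    }
}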