iOS: CVPixelBufferCreate memory is not released correctly when making a video from images

Date: 2015-02-09 11:47:05

Tags: ios objective-c xcode memory

I am making images into a video, but the app always crashes with a memory warning because CVPixelBufferCreate allocates too much. I don't know how to handle it. I have seen many similar topics, but none of them solved my problem.


Here is my code:

- (void) writeImagesArray:(NSArray*)array asMovie:(NSString*)path
{
    NSError *error  = nil;
    UIImage *first = [array objectAtIndex:0];
    CGSize frameSize = first.size;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                  [NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithDouble:frameSize.width],AVVideoWidthKey,
                                   [NSNumber numberWithDouble:frameSize.height], AVVideoHeightKey,
                                   nil];

    AVAssetWriterInput* writerInput = [AVAssetWriterInput
                                       assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];

    self.adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                     sourcePixelBufferAttributes:nil];

    [videoWriter addInput:writerInput];

    //Start Session
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    int frameCount = 0;
    CVPixelBufferRef buffer = NULL;
    for(UIImage *img in array)
    {
        buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];
        if (self.adaptor.assetWriterInput.readyForMoreMediaData)
        {
            CMTime frameTime =  CMTimeMake(frameCount,FPS);
            [self.adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
        }
        if(buffer)
            CVPixelBufferRelease(buffer);

        frameCount++;
    }

    [writerInput markAsFinished];
    [videoWriter finishWritingWithCompletionHandler:^{

        if (videoWriter.status == AVAssetWriterStatusFailed) {

            NSLog(@"Movie save failed.");

        }else{

            NSLog(@"Movie saved.");
        }
    }];

    NSLog(@"Finished.);
}
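
Note that the loop above silently skips a frame whenever readyForMoreMediaData is NO, and appending frames as fast as the loop can run lets unencoded buffers pile up. A minimal sketch of an alternative, assuming the method runs off the main thread, is to wait for the input before each append:

while (!self.adaptor.assetWriterInput.readyForMoreMediaData) {
    // Give the writer time to drain its internal queue before the next frame.
    [NSThread sleepForTimeInterval:0.01];
}
CMTime frameTime = CMTimeMake(frameCount, FPS);
[self.adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];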


- (CVPixelBufferRef)newPixelBufferFromCGImage: (CGImageRef) image andFrameSize:(CGSize)frameSize
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];

    CVPixelBufferRef pxbuffer = NULL;

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          frameSize.width,
                                          frameSize.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
                                          &pxbuffer);

    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGBitmapInfo bitmapInfo = (CGBitmapInfo) kCGImageAlphaNoneSkipFirst;
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata,
                                                 frameSize.width,
                                                 frameSize.height,
                                                 8,
                                                 4*frameSize.width,
                                                 rgbColorSpace,
                                                 bitmapInfo);

    NSParameterAssert(context);
    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}
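
A side note on this helper: CVPixelBufferCreate may pad each row for alignment, so the bitmap context should use the buffer's actual stride rather than assuming 4*frameSize.width. A sketch of the safer call:

CGContextRef context = CGBitmapContextCreate(pxdata,
                                             frameSize.width,
                                             frameSize.height,
                                             8,
                                             // actual stride; may be larger than 4 * width
                                             CVPixelBufferGetBytesPerRow(pxbuffer),
                                             rgbColorSpace,
                                             bitmapInfo);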

Update

I make the video in small segments. After adding [NSThread sleepForTimeInterval:0.00005]; inside the loop, the memory is just magically released.

However, this line causes my UI to freeze for a few seconds. Is there a better solution?

for(UIImage *img in array)
{
    buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];
    //CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, adaptor.pixelBufferPool, &buffer);
    if (adaptor.assetWriterInput.readyForMoreMediaData)
    {
        CMTime frameTime =  CMTimeMake(frameCount,FPS);
        [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
    }

    if(buffer)
        CVPixelBufferRelease(buffer);

    frameCount++;

    [NSThread sleepForTimeInterval:0.00005];
}
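
The commented-out CVPixelBufferPoolCreatePixelBuffer line above points at another option: recycle pixel buffers from the adaptor's pool instead of creating a fresh one per frame. A sketch, assuming the adaptor is created with non-nil sourcePixelBufferAttributes (with nil attributes, as in the original code, pixelBufferPool may be NULL):

NSDictionary *attributes = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32ARGB),
                              (id)kCVPixelBufferWidthKey           : @(frameSize.width),
                              (id)kCVPixelBufferHeightKey          : @(frameSize.height) };
self.adaptor = [AVAssetWriterInputPixelBufferAdaptor
                assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                sourcePixelBufferAttributes:attributes];

// Per frame: pull a recycled buffer from the pool instead of CVPixelBufferCreate.
CVPixelBufferRef buffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, self.adaptor.pixelBufferPool, &buffer);
// ... draw the image into 'buffer', append it, then release it:
CVPixelBufferRelease(buffer);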

Here is the memory usage after the change:

(memory usage screenshot)
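
As for a better solution than sleeping on the calling thread: a sketch, assuming the frame loop is handed to AVFoundation via requestMediaDataWhenReadyOnQueue:usingBlock:, which pulls frames on a background queue only when the input can accept them, so the UI never blocks:

dispatch_queue_t writerQueue = dispatch_queue_create("video.writer", DISPATCH_QUEUE_SERIAL);
__block NSUInteger frameIndex = 0;
[writerInput requestMediaDataWhenReadyOnQueue:writerQueue usingBlock:^{
    while (writerInput.readyForMoreMediaData && frameIndex < array.count) {
        @autoreleasepool {
            UIImage *img = array[frameIndex];
            CVPixelBufferRef buffer = [self newPixelBufferFromCGImage:img.CGImage
                                                         andFrameSize:frameSize];
            CMTime frameTime = CMTimeMake((int64_t)frameIndex, FPS);
            [self.adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
            CVPixelBufferRelease(buffer);
            frameIndex++;
        }
    }
    if (frameIndex >= array.count) {
        [writerInput markAsFinished];
        [videoWriter finishWritingWithCompletionHandler:^{
            NSLog(@"Movie saved.");
        }];
    }
}];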

2 Answers:

Answer 0 (score: 1)

From a quick look at the code, I can't see anything wrong with the management of the CVBuffer itself. What I think may be the root of your problem is the array of UIImages. A UIImage has this behavior: the underlying image is not decoded into memory until you request its CGImage property or draw it, so unused images have only a small memory footprint.
Your enumeration calls the CGImage property on every image and you never get rid of them, which can explain the constant growth in memory allocations.
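
One common way to act on this, a sketch rather than the answer's own code: wrap each iteration of the writing loop in an @autoreleasepool so the autoreleased objects created while decoding and drawing a frame are freed immediately instead of accumulating:

for (UIImage *img in array) {
    @autoreleasepool {
        CVPixelBufferRef buffer = [self newPixelBufferFromCGImage:img.CGImage
                                                     andFrameSize:frameSize];
        if (self.adaptor.assetWriterInput.readyForMoreMediaData) {
            CMTime frameTime = CMTimeMake(frameCount, FPS);
            [self.adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
        }
        CVPixelBufferRelease(buffer);
        frameCount++;
    }
}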

Answer 1 (score: 0)

If you don't use the images afterwards, you can do this:

    __block int64_t frameCount = 0; // declared here so the snippet is self-contained
    [images enumerateObjectsUsingBlock:^(UIImage * _Nonnull img, NSUInteger idx, BOOL * _Nonnull stop) {
        CVPixelBufferRef pixelBuffer = [self pixelBufferFromCGImage:img.CGImage frameSize:[VDVideoEncodeConfig globalConfig].size];

        CMTime frameTime = CMTimeMake(frameCount, (int32_t)[VDVideoEncodeConfig globalConfig].frameRate);
        frameCount++;
        [_assetRW appendNewSampleBuffer:pixelBuffer pst:frameTime];

        CVPixelBufferRelease(pixelBuffer);
        // Releasing the element here frees the image's memory.
        // Holding on to img.CGImage is what causes the growth you see in Instruments.
        images[idx] = [NSNull null];
    }];
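
Note that this only works if images is an NSMutableArray. Replacing each processed element with NSNull drops the array's strong reference to the UIImage, so, assuming nothing else retains it, the decoded bitmap can be freed as soon as its frame has been appended.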