Swift 3 CGContext Memory Leak

Date: 2017-04-11 07:48:21

Tags: ios swift memory-leaks cgimage

I'm using a CGBitmapContext to convert the color space to ARGB and read out the pixel data values. I malloc the buffer for the bitmap context and free it when finished, but I still see a memory leak in Instruments. I think I may be doing something wrong, so any help would be appreciated.

Here is the ARGBBitmapContext function:

func createARGBBitmapContext(width: Int, height: Int) -> CGContext {
    // Get image width, height
    let pixelsWide = width
    let pixelsHigh = height

    let bitmapBytesPerRow = pixelsWide * 4
    let bitmapByteCount = bitmapBytesPerRow * pixelsHigh

    let colorSpace = CGColorSpaceCreateDeviceRGB()
    // Here is the malloc call that Instruments complains of
    let bitmapData = malloc(bitmapByteCount)

    let context = CGContext(data: bitmapData,
                            width: pixelsWide,
                            height: pixelsHigh,
                            bitsPerComponent: 8,
                            bytesPerRow: bitmapBytesPerRow,
                            space: colorSpace,
                            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue)

    // Do I need to free something here first?
    return context!
}
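For context (not part of the original question), one way to make the buffer's ownership explicit is to free it at the call site, after every use of the context has finished. This is only a sketch; the `withARGBContext` helper name is illustrative, not from the post:

```swift
import CoreGraphics

// Sketch: the caller owns the malloc'd buffer and frees it with `defer`
// once all work with the context is done, so the free cannot be skipped
// or happen too early.
func withARGBContext(width: Int, height: Int, _ body: (CGContext) -> Void) {
    let bytesPerRow = width * 4
    guard let buffer = malloc(bytesPerRow * height) else { return }
    defer { free(buffer) }  // runs after `body` returns

    guard let context = CGContext(data: buffer,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: bytesPerRow,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue)
    else { return }
    body(context)
}
```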

Here I use the context to retrieve all the pixel values as a list of UInt8s (and this is where the memory leak occurs):

extension UIImage {

    func ARGBPixelValues() -> [UInt8] {
        let width = Int(self.size.width)
        let height = Int(self.size.height)
        var pixels = [UInt8](repeatElement(0, count: width * height * 3))

        let rect = CGRect(x: 0, y: 0, width: width, height: height)
        let context = createARGBBitmapContext(width: width, height: height)
        context.clear(rect)
        context.draw(self.cgImage!, in: rect)

        var location = 0

        if let data = context.data {

            while location < (width * height) {
                let arrOffset = 3 * location
                let offset = 4 * (location)

                let R = data.load(fromByteOffset: offset + 1, as: UInt8.self)
                let G = data.load(fromByteOffset: offset + 2, as: UInt8.self)
                let B = data.load(fromByteOffset: offset + 3, as: UInt8.self)

                pixels[arrOffset]   = R
                pixels[arrOffset+1] = G
                pixels[arrOffset+2] = B

                location += 1
            }

            free(context.data) // Free the data consumed, perhaps this isn't right?
        }

        return pixels
    }
}

Instruments reports the malloc leak as 1.48 MiB, which matches my image size (540 × 720). I free the data, but apparently this isn't right.

I should mention that I know you can pass nil to the CGContext initializer (and it will manage the memory), but I'm more curious why using malloc creates a problem. Is there something more I should know? (I'm more comfortable in Obj-C.)
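As the question notes, passing nil as the data parameter lets CoreGraphics allocate and manage the backing buffer itself, so no manual malloc/free is needed. A minimal sketch of that variant (the function name is illustrative):

```swift
import CoreGraphics

// Sketch: let CGContext own its backing store by passing nil for `data`.
// CoreGraphics frees the buffer when the context is deallocated, and the
// pixels remain readable through `context.data` while the context lives.
func createManagedARGBContext(width: Int, height: Int) -> CGContext? {
    return CGContext(data: nil,
                     width: width,
                     height: height,
                     bitsPerComponent: 8,
                     bytesPerRow: width * 4,
                     space: CGColorSpaceCreateDeviceRGB(),
                     bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue)
}
```

With nil data, `bytesPerRow` may also be passed as 0 to let CoreGraphics choose an optimal row stride; if you do that, read the stride back via `context.bytesPerRow` rather than assuming `width * 4`.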

1 Answer:

Answer 0 (score: 0):

Since CoreGraphics is not handled by ARC (like all other C libraries), even in Swift you need to wrap your code in an autoreleasepool. Especially if you are not on the main thread (and you shouldn't be on the main thread when CoreGraphics is involved... .userInitiated or lower is appropriate).

func myFunc() {  
    for _ in 0 ..< makeMoneyFast {
        autoreleasepool {
            // Create CGImageRef etc...
            // Do Stuff... whir... whiz... PROFIT!
        }
    }
}
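The answer's advice about staying off the main thread could look like the sketch below; the queue and the work inside the pool are placeholders, not code from the post:

```swift
import Dispatch

// Sketch: run the CoreGraphics work at .userInitiated QoS off the main
// thread, wrapping each unit of work in autoreleasepool as the answer
// suggests, so autoreleased CF/CG temporaries are drained promptly.
DispatchQueue.global(qos: .userInitiated).async {
    autoreleasepool {
        // Create the bitmap context, draw the image, read the pixels...
    }
}
```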

For those who care, your Objective-C should be wrapped like this too:

BOOL result = NO;
NSMutableData* data = [[NSMutableData alloc] init];
@autoreleasepool {
    CGImageRef image = [self CGImageWithResolution:dpi
                                          hasAlpha:hasAlpha
                                     relativeScale:scale];

    NSAssert(image != nil, @"could not create image for TIFF export");

    if (image == nil)
        return nil;

    CGImageDestinationRef destRef = CGImageDestinationCreateWithData((CFMutableDataRef)data, kUTTypeTIFF, 1, NULL);
    CGImageDestinationAddImage(destRef, image, (CFDictionaryRef)options);
    result = CGImageDestinationFinalize(destRef);
    CFRelease(destRef);
}

if (result) {
    return [data copy];
} else {
    return nil;
}

See this answer for more details.