OS X Objective-C application uses too much memory

Asked: 2014-04-27 16:31:55

Tags: objective-c macos memory

I am writing an OS X application that creates a video from a series of images. It was developed using the code from here: Make movie file with picture Array and song file, using AVAsset, but without the audio portion.

The code works and creates an mpg file.

The problem is memory pressure. The app does not appear to release any memory. Using Xcode Instruments I found the biggest culprits are:

CVPixelBufferCreate
[image TIFFRepresentation];
CGImageSourceCreateWithData
CGImageSourceCreateImageAtIndex

I tried adding explicit release code, but ARC should already be doing that. Eventually OS X hangs or crashes.

I am not sure how to handle the memory problem. There are no mallocs in the code. I am open to suggestions; many other people seem to have used this same code.

Here is the code, based on the link above:

- (void)ProcessImagesToVideoFile:(NSError **)error_p size:(NSSize)size videoFilePath:(NSString *)videoFilePath jpegs:(NSMutableArray *)jpegs fileLocation:(NSString *)fileLocation
{

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                              [NSURL fileURLWithPath:videoFilePath]
                                                       fileType:AVFileTypeMPEG4
                                                          error:error_p];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                               [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                               nil];
     AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
                                        assetWriterInputWithMediaType:AVMediaTypeVideo
                                        outputSettings:videoSettings];


    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                     sourcePixelBufferAttributes:nil];

    NSParameterAssert(videoWriterInput);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);

    videoWriterInput.expectsMediaDataInRealTime = YES;

    [videoWriter addInput:videoWriterInput];
    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];


    CVPixelBufferRef buffer = NULL;

    //Write every image in the array into the movie file.

    int frameCount = 0;

    for(int i = 0; i<[jpegs count]; i++)
    {
        NSString *filePath = [NSString stringWithFormat:@"%@%@", fileLocation, [jpegs objectAtIndex:i]];
        NSImage *jpegImage = [[NSImage alloc ]initWithContentsOfFile:filePath];
        CMTime frameTime = CMTimeMake(frameCount,(int32_t) 24);

        BOOL append_ok = NO;
        int j = 0;
        while (!append_ok && j < 30)
        {
            if (adaptor.assetWriterInput.readyForMoreMediaData)
            {
                if ((frameCount % 25) == 0)
                {
                    NSLog(@"appending %d to %@ attemp %d\n", frameCount, videoFilePath, j);
                }


                buffer = [self pixelBufferFromCGImage:jpegImage  andSize:size];
                append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
                if (append_ok == NO) //fails on 3GS, but works on iPhone 4
                {
                    NSLog(@"failed to append buffer");
                    NSLog(@"The error is %@", [videoWriter error]);
                }
                //CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
                //NSParameterAssert(bufferPool != NULL);

                if(buffer)
                {
                    CVPixelBufferRelease(buffer);
                    //CVBufferRelease(buffer);
                }
            }
            else
            {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }

        if (!append_ok)
        {
            printf("error appending image %d times %d\n", frameCount, j);
        }

        frameCount++;
            //CVBufferRelease(buffer);
        jpegImage = nil;
        buffer = nil;
    }

    //Finish writing the movie:
    [videoWriterInput markAsFinished];
    [videoWriter finishWritingWithCompletionHandler:^{
        NSLog(@"finished writing");
    }];
}

- (CVPixelBufferRef) pixelBufferFromCGImage: (NSImage *) image andSize:(CGSize) size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES],     kCVPixelBufferCGImageCompatibilityKey,
                         [NSNumber numberWithBool:YES],     kCVPixelBufferCGBitmapContextCompatibilityKey,
                         nil];
    CVPixelBufferRef pxbuffer = NULL;

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                      size.width,
                                      size.height,
                                      kCVPixelFormatType_32ARGB,
                                      (__bridge CFDictionaryRef) options,
                                      &pxbuffer);

    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height,
                                             8, 4*size.width, rgbColorSpace,
                                             kCGImageAlphaPremultipliedFirst);
    NSParameterAssert(context);
    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
    CGImageRef imageRef = [self nsImageToCGImageRef:image];
    CGRect imageRect = CGRectMake(0, 0, CGImageGetWidth(imageRef), CGImageGetHeight(imageRef));
    CGContextDrawImage(context, imageRect, imageRef);


    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    imageRef = nil;
    context = nil;
    rgbColorSpace = nil;
    return pxbuffer;
}

- (CGImageRef)nsImageToCGImageRef:(NSImage*)image
{
    NSData * imageData = [image TIFFRepresentation];// memory hog
    CGImageRef imageRef;
    if(!imageData) return nil;
    CGImageSourceRef imageSource = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    imageRef = CGImageSourceCreateImageAtIndex(imageSource, 0, NULL);

    imageData = nil;
    imageSource = nil;
    return imageRef;
}

2 Answers:

Answer 0 (score: 0)

ARC only applies to retainable object pointers. The ARC documentation defines them as:

    A retainable object pointer (or "retainable pointer") is a value of a
    retainable object pointer type ("retainable type"). There are three kinds
    of retainable object pointer types:

    1. block pointers (formed by applying the caret (^) declarator sigil to a
       function type)
    2. Objective-C object pointers (id, Class, NSFoo *, etc.)
    3. typedefs marked with __attribute__((NSObject))

    Other pointer types, such as int * and CFStringRef, are not subject to
    ARC's semantics and restrictions.

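In other words, ARC inserts retain/release only for those three kinds of pointers; Core Foundation-style references pass through it untouched. A minimal sketch of the difference (the buffer size and format here are arbitrary, chosen just for illustration):

NSImage *image = [[NSImage alloc] init];   // retainable object pointer: ARC releases it for you

CVPixelBufferRef buffer = NULL;            // plain C pointer type, like CFStringRef: ARC ignores it
CVPixelBufferCreate(kCFAllocatorDefault, 640, 480,
                    kCVPixelFormatType_32ARGB, NULL, &buffer);
CVPixelBufferRelease(buffer);              // so the Create call must be balanced by hand
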
You have already explicitly called release in one place:

CGContextRelease(context);

You should do the same for the other objects, for example

CVPixelBufferRelease(pxbuffer);

for pxbuffer.
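
The same applies to the CGImageRef that pixelBufferFromCGImage: gets back from nsImageToCGImageRef: and never releases. A sketch of the missing call, keeping the drawing code from the question:

CGImageRef imageRef = [self nsImageToCGImageRef:image];
CGRect imageRect = CGRectMake(0, 0, CGImageGetWidth(imageRef), CGImageGetHeight(imageRef));
CGContextDrawImage(context, imageRect, imageRef);
CGImageRelease(imageRef);   // balances the Create call made inside nsImageToCGImageRef: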

Answer 1 (score: 0)

Your code is using ARC, but the libraries you are calling may not be. They may rely on the older autorelease pool system to free memory.

You should read up on how autorelease pools work; it is fundamental material every Obj-C developer needs to memorize. Basically, any object can be added to the current pool, and it is released when the pool is drained.

By default, the pool on the main thread is drained every time the application enters an idle state. That usually works fine, because the main thread should never be busy for more than a few hundredths of a second, and you cannot really build up much memory in that amount of time.

When you perform a lengthy, memory-hungry operation, you need to set up autorelease pools manually. They usually go inside for or while loops (although you can actually put them anywhere you want; that is just the most useful scenario):

for ( ... ) {
  @autoreleasepool {
    // do some stuff
  }
}
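
Applied to the loop in the question (a sketch; the pixel-buffer creation and appending stay exactly as before), the NSImage and the autoreleased TIFFRepresentation data would then be freed every iteration instead of piling up until the method returns:

for (int i = 0; i < [jpegs count]; i++)
{
    @autoreleasepool {
        NSString *filePath = [NSString stringWithFormat:@"%@%@",
                              fileLocation, [jpegs objectAtIndex:i]];
        NSImage *jpegImage = [[NSImage alloc] initWithContentsOfFile:filePath];
        // ... create and append the pixel buffer as in the question ...
    }   // autoreleased objects from this iteration are drained here
}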

Also, ARC only applies to Objective-C code. It does not apply to objects created by C functions such as CGColorSpaceCreateDeviceRGB() and CVPixelBufferCreate(). Make sure you manually release all of those.
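
For instance, the CGImageSourceRef created inside nsImageToCGImageRef: is leaked: setting imageSource = nil does nothing for a CF object under ARC. A sketch of that method with the manual release added (the caller then owns the returned CGImageRef and must release it, as shown in the first answer):

- (CGImageRef)nsImageToCGImageRef:(NSImage *)image
{
    NSData *imageData = [image TIFFRepresentation];
    if (!imageData) return NULL;
    CGImageSourceRef imageSource =
        CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    CGImageRef imageRef = CGImageSourceCreateImageAtIndex(imageSource, 0, NULL);
    CFRelease(imageSource);   // Create-rule object: ARC will not release it
    return imageRef;          // caller must balance with CGImageRelease()
}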