App crashes while processing an image on iOS 7

Asked: 2014-04-09 12:31:49

Tags: ios iphone image-processing cocos2d-iphone

I'm building an image-processing app. An image can have multiple layers on top of it, and the composite can be saved to disk. This is the code that does it (TransparentBG.png is a 3000x3000 transparent image):

CCSprite *blankImage = [CCSprite spriteWithFile:@"assetsFullSize/TransparentBG.png"];
//CCSprite *blankImage = [CCSprite spriteWithFile:@"assets/centerPaneBG.png"];
blankImage.tag=SAVE_IMAGE_BASE_TAG + [[AppManager instance] generateNextSaveImageIndex];
NSLog(@"   blankImage.tag = %i",blankImage.tag);
NSLog(@"   blankImage.size = %@",NSStringFromCGSize(blankImage.contentSize));
for(int i=1; i<[layers count]; i++)
{
    NSLog(@"   i = %i",i);

    ImageFeature *feature = [layers objectAtIndex:i];
    CCSprite *layer = (CCSprite *)[self getChildByTag:LAYER_INDEX_BASE + i];
    NSLog(@"   ********* layer.position = %@",NSStringFromCGPoint(layer.position));
    NSLog(@"   ********* feature.posX,posYition = %i,%i",feature.posX,feature.posY);
    //        [layer removeFromParent];
    //        CCSprite *layerCopy = [layer copy];
    CCTexture2D *texture = [layer texture];
    CCSprite *layerCopy = [CCSprite spriteWithTexture:texture];
    layerCopy.anchorPoint = layer.anchorPoint;
    NSLog(@"   anchorPoint = %@",NSStringFromCGPoint(layer.anchorPoint));
    //        layerCopy.position = ccpAdd(layer.position,ccp(-LEFT_PANE_WIDTH,0));
    //        layerCopy.position = ccp([self getImageFeature_posX_fomSpritePosition:layer],[self getImageFeature_posY_fomSpritePosition:layer]);
    layerCopy.position = ccp((feature.posX/3000) * blankImage.contentSize.width,(feature.posY/3000) * blankImage.contentSize.height);
    layerCopy.position = IS_RETINA ? ccp(feature.posX / 2,feature.posY / 2) : ccp(feature.posX,feature.posY);
    NSLog(@"   ********* layerCopy.position = %@",NSStringFromCGPoint(layerCopy.position));
    layerCopy.color = layer.color;
    layerCopy.scaleX = layer.scaleX / VISUAL_SCALING_FACTOR;
    layerCopy.scaleY = layer.scaleY / VISUAL_SCALING_FACTOR;
    layerCopy.rotation = layer.rotation;
    layerCopy.opacity = layer.opacity;
    [blankImage addChild:layerCopy z:i tag:layer.tag];
    //        layer.anchorPoint = ccp(0.5,0.5);
    //        layer.position = ccpAdd(layer.position,ccp(-LEFT_PANE_WIDTH,0));
    //        [blankImage addChild:layer z:i tag:layer.tag];
}

//    CCSprite *attribution = [CCSprite spriteWithFile:@"assets/pikpark.png"];

//    CCSprite *attribution = [CCSprite spriteWithFile:@"assetsFullSize/pikpark.png"];
//    attribution.anchorPoint = ccp(0.5,0.5);
//    attribution.position = ccp(blankImage.contentSize.width-  (attribution.contentSize.width/2.0),attribution.contentSize.height/2.0);
//    attribution.opacity = 64;
//    [blankImage addChild:attribution z:8999 tag:ATTRIBUTION];

//    blankImage.scale = 300.0/blankImage.contentSize.height;
CGPoint p = blankImage.anchorPoint;
[blankImage setAnchorPoint:ccp(0,0)];

//  CCRenderTexture *renderer = [CCRenderTexture renderTextureWithWidth:300 height:300];
//  CCRenderTexture *renderer = [CCRenderTexture renderTextureWithWidth:1500 height:1500];
//  CCRenderTexture *renderer = [CCRenderTexture renderTextureWithWidth:3000 height:3000];
CCRenderTexture *renderer = [CCRenderTexture renderTextureWithWidth:blankImage.contentSize.width height:blankImage.contentSize.height];

[renderer begin];
[blankImage visit];
[renderer end];

[blankImage setAnchorPoint:p];

UIImage *thumbImage = [renderer getUIImage];

NSLog(@"   thumbImage.size = %@",NSStringFromCGSize([thumbImage size]));
NSString *key = [NSString stringWithFormat:@"%i",blankImage.tag];
NSLog(@"   key = %@",key);
CCSprite *renderedSprite = [CCSprite spriteWithCGImage:thumbImage.CGImage key:key];
NSLog(@"   width=%3f   height=%3f",renderedSprite.contentSize.width,renderedSprite.contentSize.height);

// And save to UserDocs
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,NSUserDomainMask,YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *galleryDirectory = [documentsDirectory stringByAppendingPathComponent:@"gallery"];
NSLog(@"   galleryDirectory = %@",galleryDirectory);

NSString *saveFileName = [NSString stringWithFormat:@"image_%i.png",blankImage.tag];
NSLog(@"   saveFileName = %@",saveFileName);

NSString *galleryPath = [galleryDirectory stringByAppendingPathComponent:saveFileName];
NSLog(@"   galleryPath = %@",galleryPath);

NSError *error = nil;
NSData *imageData = UIImagePNGRepresentation(thumbImage);
if ([imageData writeToFile:galleryPath options:0 error:&error]) { // non-atomic write, as before
    NSLog(@"   GALLERY IMAGE SAVED!");
} else {
    NSLog(@"   failed to save gallery image: %@",error);
}

I've tested this on the simulator and it works fine. But when I test on my iPad 2, the app crashes with a fatal memory pressure exception.

Using breakpoints I can see that the following line from the code above is the one that crashes the app with the memory pressure exception:

UIImage *thumbImage = [renderer getUIImage];

If I change the size of the CCRenderTexture *renderer to 300x300, the app stops crashing, but that badly hurts the quality and dimensions of the saved image; 3000x3000 produces a high-quality result. I tried using dispatch_async, but with no success.
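
The dispatch_async attempt was roughly along these lines (a sketch, not the exact code): only the PNG encoding and the file write go to a background queue, since the rendering and the getUIImage call have to stay on the cocos2d/GL thread, and deferring the encode does not by itself lower the peak memory footprint. thumbImage and galleryPath are the values from the code above.

// Sketch: defer only the CPU-side encode + write to a background queue.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSData *imageData = UIImagePNGRepresentation(thumbImage); // safe off the main thread
    NSError *error = nil;
    BOOL ok = [imageData writeToFile:galleryPath options:0 error:&error];
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"   gallery image saved: %i (error: %@)", ok, error);
    });
});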

Is there anything I can do to get around the memory pressure issue? Please help.

1 Answer:

Answer (score: 2):

A 3000x3000 texture consumes over 34 megabytes of texture memory.
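
For reference, the arithmetic behind that figure, assuming the default RGBA8888 pixel format (4 bytes per pixel):

3000 × 3000 pixels × 4 bytes/pixel = 36,000,000 bytes ≈ 34.3 MB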

You create a texture from the image (x1). Then a render texture is created for the rendering (x2). Then you create a UIImage from the render texture (x3). Finally, you create an NSData from it with UIImagePNGRepresentation (x4).

So at that point the texture data is being held in various buffers in memory at least 4 times over = 136 MB.

I say "at least" because cocos2d has some known texture-loading inefficiencies, and the same may be true of UIImage and the like. For example, creating a CCTexture2D may actually create two buffers of the same size, so real memory usage could be more like 170 MB. Run Instruments to get an idea of how much memory is actually being used.

One thing you could try is to separate the individual steps, to give the temporary buffers a chance to be released from memory in between. For example, after loading the CCTexture2D, don't create the sprite and render texture right away; instead schedule the next step with performSelector:withObject:afterDelay:, with a delay of perhaps 20 seconds. You can do the same after creating the UIImage from the render texture, so that the PNG representation is held off for a while.
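
A minimal sketch of the second split point (between rendering and PNG encoding); SaveLayer, renderAndScheduleSaveOfSprite:, writeImageToGallery: and galleryPathForCurrentImage are hypothetical names, not part of the question's code, and the delayed hand-off is the only point being illustrated:

#import "cocos2d.h"
#import "SaveLayer.h" // hypothetical layer class that performs the save

@implementation SaveLayer (DeferredSave)

- (void)renderAndScheduleSaveOfSprite:(CCSprite *)composedSprite
{
    // GL work stays on the cocos2d thread.
    CCRenderTexture *renderer =
        [CCRenderTexture renderTextureWithWidth:composedSprite.contentSize.width
                                         height:composedSprite.contentSize.height];
    [renderer begin];
    [composedSprite visit];
    [renderer end];

    UIImage *fullImage = [renderer getUIImage];

    // Hand off the PNG encoding and disk write with a delay, so the render
    // texture and its buffers can be released before the next big allocation.
    [self performSelector:@selector(writeImageToGallery:)
               withObject:fullImage
               afterDelay:20.0]; // delay value is a guess; check with Instruments
}

- (void)writeImageToGallery:(UIImage *)image
{
    NSData *imageData = UIImagePNGRepresentation(image);       // another full-size buffer
    NSString *galleryPath = [self galleryPathForCurrentImage]; // hypothetical helper
    NSError *error = nil;
    if (![imageData writeToFile:galleryPath options:0 error:&error]) {
        NSLog(@"save failed: %@", error);
    }
}

@end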