I use this code to scale and rotate images taken with the camera. When I run it I see a huge memory spike, something like 20 MB. In Instruments I can see that this line:

CGContextDrawImage(ctxt, orig, self.CGImage);

holds the 20 MB. Is that normal for a full-resolution photo? The iPhone 4S can handle it, but older devices crash because of this code.

After rescaling the image I need it as NSData, so I use UIImageJPEGRepresentation(). Together that pushes the peak even higher: memory usage reaches 70 MB for a few seconds.

Yes, I have read almost every camera-related iOS question about memory usage, but none of them has the answer.
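Roughly how the two steps are combined (a simplified sketch; photo stands for the image returned by the camera picker and 1024.0 is just an example maximum size, not the exact values in my app):

// Simplified sketch of the pipeline described above -- photo and the
// 1024.0 max size are placeholders
UIImage* small = [photo rotateAndScaleFromCameraWithMaxSize:1024.0];
NSData*  jpeg  = UIImageJPEGRepresentation(small, 0.75);
// both the resized image and the JPEG buffer are alive at this point,
// which is roughly when the ~70 MB peak shows up in Instruments

The full category follows: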
// WBImage.mm -- extra UIImage methods
// by allen brunson march 29 2009

#include "WBImage.h"

static inline CGFloat degreesToRadians(CGFloat degrees)
{
    return M_PI * (degrees / 180.0);
}

static inline CGSize swapWidthAndHeight(CGSize size)
{
    CGFloat swap = size.width;

    size.width  = size.height;
    size.height = swap;

    return size;
}
@implementation UIImage (WBImage)

// rotate an image to any 90-degree orientation, with or without mirroring.
// original code by kevin lohman, heavily modified by yours truly.
// http://blog.logichigh.com/2008/06/05/uiimage-fix/

-(UIImage*)rotate:(UIImageOrientation)orient
{
    CGRect             bnds = CGRectZero;
    UIImage*           copy = nil;
    CGContextRef       ctxt = nil;
    CGRect             rect = CGRectZero;
    CGAffineTransform  tran = CGAffineTransformIdentity;

    bnds.size = self.size;
    rect.size = self.size;

    switch (orient)
    {
        case UIImageOrientationUp:
            return self;

        case UIImageOrientationUpMirrored:
            tran = CGAffineTransformMakeTranslation(rect.size.width, 0.0);
            tran = CGAffineTransformScale(tran, -1.0, 1.0);
            break;

        case UIImageOrientationDown:
            tran = CGAffineTransformMakeTranslation(rect.size.width,
                                                    rect.size.height);
            tran = CGAffineTransformRotate(tran, degreesToRadians(180.0));
            break;

        case UIImageOrientationDownMirrored:
            tran = CGAffineTransformMakeTranslation(0.0, rect.size.height);
            tran = CGAffineTransformScale(tran, 1.0, -1.0);
            break;

        case UIImageOrientationLeft:
            bnds.size = swapWidthAndHeight(bnds.size);
            tran = CGAffineTransformMakeTranslation(0.0, rect.size.width);
            tran = CGAffineTransformRotate(tran, degreesToRadians(-90.0));
            break;

        case UIImageOrientationLeftMirrored:
            bnds.size = swapWidthAndHeight(bnds.size);
            tran = CGAffineTransformMakeTranslation(rect.size.height,
                                                    rect.size.width);
            tran = CGAffineTransformScale(tran, -1.0, 1.0);
            tran = CGAffineTransformRotate(tran, degreesToRadians(-90.0));
            break;

        case UIImageOrientationRight:
            bnds.size = swapWidthAndHeight(bnds.size);
            tran = CGAffineTransformMakeTranslation(rect.size.height, 0.0);
            tran = CGAffineTransformRotate(tran, degreesToRadians(90.0));
            break;

        case UIImageOrientationRightMirrored:
            bnds.size = swapWidthAndHeight(bnds.size);
            tran = CGAffineTransformMakeScale(-1.0, 1.0);
            tran = CGAffineTransformRotate(tran, degreesToRadians(90.0));
            break;

        default:
            // orientation value supplied is invalid
            assert(false);
            return nil;
    }

    UIGraphicsBeginImageContext(rect.size);
    ctxt = UIGraphicsGetCurrentContext();

    switch (orient)
    {
        case UIImageOrientationLeft:
        case UIImageOrientationLeftMirrored:
        case UIImageOrientationRight:
        case UIImageOrientationRightMirrored:
            CGContextScaleCTM(ctxt, -1.0, 1.0);
            CGContextTranslateCTM(ctxt, -rect.size.height, 0.0);
            break;

        default:
            CGContextScaleCTM(ctxt, 1.0, -1.0);
            CGContextTranslateCTM(ctxt, 0.0, -rect.size.height);
            break;
    }

    CGContextConcatCTM(ctxt, tran);
    CGContextDrawImage(ctxt, bnds, self.CGImage);

    copy = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return copy;
}
-(UIImage*)rotateAndScaleFromCameraWithMaxSize:(CGFloat)maxSize
{
    UIImage* imag = self;

    imag = [imag rotate:imag.imageOrientation];
    imag = [imag scaleWithMaxSize:maxSize];

    return imag;
}

-(UIImage*)scaleWithMaxSize:(CGFloat)maxSize
{
    return [self scaleWithMaxSize:maxSize quality:kCGInterpolationHigh];
}

// scale the image down so that neither dimension exceeds maxSize,
// preserving the original aspect ratio
-(UIImage*)scaleWithMaxSize:(CGFloat)maxSize
                    quality:(CGInterpolationQuality)quality
{
    CGRect        bnds = CGRectZero;
    UIImage*      copy = nil;
    CGContextRef  ctxt = nil;
    CGRect        orig = CGRectZero;
    CGFloat       rtio = 0.0;
    CGFloat       scal = 1.0;

    bnds.size = self.size;
    orig.size = self.size;
    rtio = orig.size.width / orig.size.height;

    // nothing to do if the image already fits within maxSize
    if ((orig.size.width <= maxSize) && (orig.size.height <= maxSize))
    {
        return self;
    }

    if (rtio > 1.0)
    {
        bnds.size.width  = maxSize;
        bnds.size.height = maxSize / rtio;
    }
    else
    {
        bnds.size.width  = maxSize * rtio;
        bnds.size.height = maxSize;
    }

    UIGraphicsBeginImageContext(bnds.size);
    ctxt = UIGraphicsGetCurrentContext();
    scal = bnds.size.width / orig.size.width;

    CGContextSetInterpolationQuality(ctxt, quality);
    CGContextScaleCTM(ctxt, scal, -scal);
    CGContextTranslateCTM(ctxt, 0.0, -orig.size.height);
    CGContextDrawImage(ctxt, orig, self.CGImage);

    copy = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return copy;
}

@end
Answer 0 (score: 3)

I ended up using Image I/O instead, and it uses much less memory!
#import <ImageIO/ImageIO.h>

-(UIImage *)resizeImageToMaxDimension:(float)dimension withPath:(NSString *)path
{
    NSURL *imageUrl = [NSURL fileURLWithPath:path];
    CGImageSourceRef imageSource = CGImageSourceCreateWithURL((__bridge CFURLRef)imageUrl, NULL);

    NSDictionary *thumbnailOptions = [NSDictionary dictionaryWithObjectsAndKeys:
        (__bridge id)kCFBooleanTrue, (__bridge id)kCGImageSourceCreateThumbnailWithTransform,
        (__bridge id)kCFBooleanTrue, (__bridge id)kCGImageSourceCreateThumbnailFromImageAlways,
        [NSNumber numberWithFloat:dimension], (__bridge id)kCGImageSourceThumbnailMaxPixelSize,
        nil];

    CGImageRef thumbnail = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, (__bridge CFDictionaryRef)thumbnailOptions);
    UIImage *resizedImage = [UIImage imageWithCGImage:thumbnail];

    CFRelease(thumbnail);
    CFRelease(imageSource);

    return resizedImage;
}
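For completeness, a sketch of how this could slot into the original workflow, assuming the camera image has already been written to a file (cameraImagePath and the 1024.0 limit are placeholders, not values from the question):

// Sketch only: cameraImagePath is a placeholder for wherever the camera
// JPEG was saved; 1024.0 is an example maximum pixel size.
UIImage *resized  = [self resizeImageToMaxDimension:1024.0 withPath:cameraImagePath];
NSData  *jpegData = UIImageJPEGRepresentation(resized, 0.75);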
Answer 1 (score: 1)

That is correct, and it comes from the photo you took with the camera. Older devices have lower-resolution cameras, which means an image taken with an iPhone 3G has a lower resolution (and therefore a smaller size) than one taken with an iPhone 4. Images are normally stored compressed, but when they are opened in memory for any kind of manipulation they have to be decompressed, and the memory they need is much larger than the file size: if I remember correctly it is roughly number_of_pixel_in_row * number_of_pixel_in_height * byte_for_pixel.
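To put rough numbers on that: a full-resolution iPhone 4S photo is about 3264 x 2448 pixels, so decompressed at 4 bytes per pixel it needs roughly 3264 * 2448 * 4 ≈ 30 MB, which is in the same range as the spikes described in the question.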
Bye,
Andrea
Answer 2 (score: 0)

At the end of the method, just before return copy;, add:

CGContextRelease(ctxt);