Saving an image from GLPaint

Date: 2009-06-03 19:33:18

Tags: iphone objective-c opengl-es quartz-graphics

I'm trying to develop a complex painting application on the iPhone. I'm currently drawing with Quartz (e.g. CGContext...). Unfortunately the Quartz overhead is too slow for the kind of drawing I'm doing, so I'm porting the drawing to OpenGL calls, using the GLPaint sample as a reference point.

Is there a way to get a UIImage/CGImage out of the EAGLView class (the equivalent of Quartz's UIGraphicsGetImageFromCurrentImageContext)? Basically I need to save the picture the GLPaint application has drawn.

5 answers:

Answer 0 (score: 15)

-(UIImage *) saveImageFromGLView
{
    NSInteger myDataLength = 320 * 480 * 4;
    // allocate array and read pixels into it.
    GLubyte *buffer = (GLubyte *) malloc(myDataLength);
    glReadPixels(0, 0, 320, 480, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
    // gl renders "upside down" so swap top to bottom into new array.
    // there's gotta be a better way, but this works.
    GLubyte *buffer2 = (GLubyte *) malloc(myDataLength);
    for(int y = 0; y < 480; y++)
    {
        for(int x = 0; x < 320 * 4; x++)
        {
            buffer2[(479 - y) * 320 * 4 + x] = buffer[y * 4 * 320 + x];
        }
    }
    // make data provider with data.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, NULL);
    // prep the ingredients
    int bitsPerComponent = 8;
    int bitsPerPixel = 32;
    int bytesPerRow = 4 * 320;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
    // make the cgimage
    CGImageRef imageRef = CGImageCreate(320, 480, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
    // then make the uiimage from that
    UIImage *myImage = [UIImage imageWithCGImage:imageRef];
    return myImage;
}
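
A hedged usage sketch (my addition, not part of the answer): assuming the method above is implemented on the EAGL-backed view itself and is called while that view's OpenGL context is current and its framebuffer is bound, saving the drawing to the photo library could look like this:

// Hypothetical call site inside the EAGL view; saveImageFromGLView is the method above.
UIImage *snapshot = [self saveImageFromGLView];
UIImageWriteToSavedPhotosAlbum(snapshot, nil, nil, nil);

Note that imageWithCGImage: returns an autoreleased UIImage, so no extra release is needed at the call site.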

Answer 1 (score: 7)

Same as @Quakeboy's answer, but with the view passed in so the size can be determined dynamically (I use this for my universal app):

- (UIImage *)saveImageFromGLView:(UIView *)glView {
    int width = glView.frame.size.width;
    int height = glView.frame.size.height;

    NSInteger myDataLength = width * height * 4;
    // allocate array and read pixels into it.
    GLubyte *buffer = (GLubyte *) malloc(myDataLength);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
    // gl renders "upside down" so swap top to bottom into new array.
    // there's gotta be a better way, but this works.
    GLubyte *buffer2 = (GLubyte *) malloc(myDataLength);
    for(int y = 0; y < height; y++)
    {
        for(int x = 0; x < width * 4; x++)
        {
            buffer2[((height - 1) - y) * width * 4 + x] = buffer[y * 4 * width + x];
        }
    }
    // make data provider with data.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, NULL);
    // prep the ingredients
    int bitsPerComponent = 8;
    int bitsPerPixel = 32;
    int bytesPerRow = 4 * width;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
    // make the cgimage
    CGImageRef imageRef = CGImageCreate(width, height, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
    // then make the uiimage from that
    UIImage *myImage = [UIImage imageWithCGImage:imageRef];
    return myImage;
}
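
One caveat about this version (my note, not part of the original answer): glView.frame is measured in points, while glReadPixels works in pixels. On a Retina display the backing framebuffer is larger than the frame, so a sketch of the adjustment might scale by the view's contentScaleFactor before reading:

// Hypothetical adjustment: convert point dimensions to pixel dimensions.
int width  = glView.frame.size.width  * glView.contentScaleFactor;
int height = glView.frame.size.height * glView.contentScaleFactor;

Answer 3 below handles the same issue by checking [UIScreen mainScreen].scale.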

Answer 2 (score: 3)

This is definitely possible. The trick is to use glReadPixels to pull the image data out of the OpenGL framebuffer into memory you can work with. Once you have a pointer to the image data, you can use CGDataProviderCreateWithData and CGImageCreate to build a CGImage from it. I'm working on an OpenGL-based drawing application that uses this technique all the time!
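
For completeness, a minimal sketch of the prerequisite step (my addition, not from the answer): glReadPixels reads from whichever framebuffer is currently bound, so the view's OpenGL context must be current and its framebuffer bound before any of the snippets above run. The context and viewFramebuffer names here are assumed to match the ivars in the GLPaint sample's PaintingView:

[EAGLContext setCurrentContext:context];                    // assumed ivar from the GLPaint sample
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);  // OpenGL ES 1.1 extension, as used by GLPaint
// ...glReadPixels now reads this view's pixels, as in the answers above.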

Answer 3 (score: 0)

Unlike the solutions above, this code does not leak memory, and it accounts for dynamic view sizes as well as Retina vs. standard displays:

-(BOOL)iPhoneRetina{
    return ([[UIScreen mainScreen] respondsToSelector:@selector(displayLinkWithTarget:selector:)] && ([UIScreen mainScreen].scale == 2.0))?YES:NO;
}

void releasePixels(void *info, const void *data, size_t size) {
    free((void*)data);
}

-(UIImage *) glToUIImage{

    int imageWidth, imageHeight;

    int scale = [self iPhoneRetina]?2:1;

    imageWidth = self.frame.size.width*scale;
    imageHeight = self.frame.size.height*scale;

    NSInteger myDataLength = imageWidth * imageHeight * 4;

    // allocate array and read pixels into it.
    GLubyte *buffer = (GLubyte *) malloc(myDataLength);
    glReadPixels(0, 0, imageWidth, imageHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    // make data provider with data.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer, myDataLength, releasePixels);

    // prep the ingredients
    int bitsPerComponent = 8;
    int bitsPerPixel = 32;
    int bytesPerRow = 4 * imageWidth;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo =  kCGImageAlphaPremultipliedLast;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    // make the cgimage

    CGImageRef imageRef = CGImageCreate(imageWidth, imageHeight, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);

    UIImage *myImage = [UIImage imageWithCGImage:imageRef scale:scale orientation:UIImageOrientationDownMirrored]; //Render image flipped, since OpenGL's data is mirrored

    CGImageRelease(imageRef);
    CGColorSpaceRelease(colorSpaceRef);

    CGDataProviderRelease(provider);

    return myImage;
}

The others leak memory because the last argument to CGDataProviderCreateWithData should be a function that frees the memory, and they also omit the CGRelease calls.
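
As a hedged usage sketch (assuming, as the use of self.frame suggests, that glToUIImage lives on the EAGL-backed view itself), the OP's goal of saving the drawing could then be met by writing the snapshot out as a PNG, for example:

// Hypothetical call site: write the snapshot into the app's Documents directory.
UIImage *snapshot = [self glToUIImage];
NSData *png = UIImagePNGRepresentation(snapshot);
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
[png writeToFile:[docs stringByAppendingPathComponent:@"drawing.png"] atomically:YES];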

Answer 4 (score: -1)

// UIGetScreenImage() is a private API and is not declared in the public
// headers, so it has to be declared manually for this to compile.
CGImageRef UIGetScreenImage(void);

void SaveScreenImage()
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    CGImageRef cgImage = UIGetScreenImage();
    void *imageBytes = NULL;
    if (cgImage == NULL) {
        // Fallback: render every window's layer into a bitmap context.
        CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
        imageBytes = malloc(320 * 480 * 4);
        CGContextRef context = CGBitmapContextCreate(imageBytes, 320, 480, 8, 320 * 4, colorspace, kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Big);
        CGColorSpaceRelease(colorspace);
        for (UIWindow *window in [[UIApplication sharedApplication] windows]) {
            CGRect bounds = [window bounds];
            CALayer *layer = [window layer];
            CGContextSaveGState(context);
            if ([layer contentsAreFlipped]) {
                CGContextTranslateCTM(context, 0.0f, bounds.size.height);
                CGContextScaleCTM(context, 1.0f, -1.0f);
            }
            [layer renderInContext:context];
            CGContextRestoreGState(context);
        }
        cgImage = CGBitmapContextCreateImage(context);
        CGContextRelease(context);
    }
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    [pool release];
}

This code saves the image exactly as you see it on screen, but it may rely on a private API (UIGetScreenImage).