Capturing Isgl3d output as an image

Date: 2012-02-27 00:42:36

Tags: iphone objective-c ios opengl-es screenshot

I'm having some difficulty taking a UIImage snapshot of a view controlled by Isgl3d. No matter what I do, I seem to end up with a black square.

I have a working camera view and a 3D model in the view, and I've tried both the buffer approach and a regular screen capture to get an image, but without any valid result.

Does anyone have working source code for taking a picture of an Isgl3d view?

4 answers:

Answer 0 (score: 5):

Here are Apple's instructions & official code for snapshotting a GL view into a UIImage (accounting for retina displays, flipped coordinates, etc.), which I've been using successfully. It's not iSGL3D-specific, of course, but as long as you can get the right context and framebuffer bound, it should do the right thing. (As the page notes, you must make sure to take the snapshot before -presentRenderbuffer: is called so that the renderbuffer contents are valid.)

https://developer.apple.com/library/ios/#qa/qa1704/_index.html

I have only a passing familiarity with the iSGL3D library, and it doesn't look like it has obvious hooks for rendering the scene without presenting it (or for rendering to an offscreen buffer first). The place you'd probably need to intervene is the -finalizeRender method of the Isgl3dGLContext subclass you're using, just before the -presentRenderbuffer call. That context is an internal framework class, so you'd likely need to rearrange the library a bit to set up (say) a delegate from the context, via the view and director, that ultimately asks your app to take any action it wants just before the "present" call. At that point you can run your screenshot code if you want to, or do nothing otherwise.
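The shape of that delegate idea can be sketched language-neutrally as a callback invoked before present. This is a minimal sketch in plain C; the names (pre_present_hook, finalize_render) are illustrative, not actual iSGL3D API:

```c
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical sketch of the hook described above: the context lets the app
 * register a callback that runs just before the renderbuffer is presented,
 * while its contents are still valid. */
typedef void (*pre_present_hook)(void *user_data);

struct gl_context {
    pre_present_hook hook;  /* set by the app; may be NULL */
    void *hook_data;
    bool presented;
};

/* Plays the role of Isgl3dGLContext's -finalizeRender: run the app's hook
 * first, then present. */
static void finalize_render(struct gl_context *ctx) {
    if (ctx->hook)
        ctx->hook(ctx->hook_data);  /* a real hook would call glReadPixels here */
    ctx->presented = true;          /* stand-in for -presentRenderbuffer: */
}
```

The key property is simply the ordering: the hook always fires before the present, which is when a glReadPixels-based screenshot is valid.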

Answer 1 (score: 3):

Is this what you want?

This takes a screenshot from the current context and framebuffer and saves it to the photo album.

If you don't want to save to the photo album, just grab the final UIImage at the end instead.

Also remember to call it after you've finished drawing, but before swapping buffers.

Additionally, if you're using MSAA, it has to be called after glResolveMultisampleFramebufferAPPLE and after the new buffer has been bound.

#ifdef AUTOSCREENSHOT

// callback for CGDataProviderCreateWithData
void releaseData(void *info, const void *data, size_t dataSize) {
    free((void*)data);
}

// callback for UIImageWriteToSavedPhotosAlbum
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    NSLog(@"Save finished");
    [image release];
}

-(void)saveCurrentScreenToPhotoAlbum {
    // screenSize (in points) and retina (scale factor) are assumed ivars of this class
    int height = (int)screenSize.y * retina;
    int width = (int)screenSize.x * retina;

    NSInteger myDataLength = width * height * 4;
    GLubyte *buffer = (GLubyte *) malloc(myDataLength);
    GLubyte *buffer2 = (GLubyte *) malloc(myDataLength);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width * 4; x++) {
            buffer2[(height - 1 - y) * width * 4 + x] = buffer[y * 4 * width + x];
        }
    }
    free(buffer);

    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, releaseData);
    int bitsPerComponent = 8;
    int bitsPerPixel = 32;
    int bytesPerRow = 4 * width;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
    CGImageRef imageRef = CGImageCreate(width, height, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);

    CGColorSpaceRelease(colorSpaceRef);
    CGDataProviderRelease(provider);

    UIImage *image = [[UIImage alloc] initWithCGImage:imageRef];
    CGImageRelease(imageRef);

    UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
}

#endif

I use this code during gameplay to save timed screenshots, so I have good material to put in the App Store.
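The heart of the snippet above is the vertical flip: glReadPixels returns rows bottom-up, while CGImage expects them top-down. The same flip can be demonstrated in isolation in plain C, copying a whole row at a time with memcpy (equivalent to the per-byte loop above); the function name is mine:

```c
#include <stdlib.h>
#include <string.h>

/* Flip an RGBA pixel buffer vertically: source row y lands in destination
 * row (height - 1 - y). RGBA = 4 bytes per pixel. Returns a newly malloc'd
 * buffer the caller must free, or NULL on allocation failure. */
static unsigned char *flip_rows_rgba(const unsigned char *src, int width, int height) {
    size_t rowBytes = (size_t)width * 4;
    unsigned char *dst = malloc(rowBytes * (size_t)height);
    if (dst == NULL)
        return NULL;
    for (int y = 0; y < height; y++)
        memcpy(dst + (size_t)(height - 1 - y) * rowBytes,
               src + (size_t)y * rowBytes,
               rowBytes);
    return dst;
}
```

A row-wise memcpy is also noticeably faster than the byte-by-byte copy for large framebuffers, though either is correct.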

Answer 2 (score: 3):

I've used this snippet successfully in one of my apps to take OpenGL screenshots.

enum {
  red,
  green,
  blue,
  alpha
};

- (UIImage *)glToUIImage {
  CGSize glSize = self.glView.bounds.size;
  NSInteger bufDataLen = glSize.width * glSize.height * 4;

  // Allocate array and read pixels into it.
  GLubyte *buffer = (GLubyte *)malloc(bufDataLen);
  glReadPixels(0, 0, glSize.width, glSize.height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

  // We need to flip the image
  NSUInteger maxRow = (NSInteger)glSize.height - 1;
  NSUInteger bytesPerRow = (NSInteger)glSize.width * 4;

  GLubyte *buffer2 = (GLubyte *)malloc(bufDataLen);
  for(int y = maxRow; y >= 0; y--) {
    for(int x = 0; x < bytesPerRow; x+=4) {
      NSUInteger c0 = y * bytesPerRow + x;
      NSUInteger c1 = (maxRow - y) * bytesPerRow + x;
      buffer2[c0+red] = buffer[c1+red];
      buffer2[c0+green] = buffer[c1+green];
      buffer2[c0+blue] = buffer[c1+blue];
      buffer2[c0+alpha] = buffer[c1+alpha];
    }
  }
  free(buffer);

  // Make data provider with data
  CFDataRef imageData = CFDataCreate(NULL, buffer2, bufDataLen);
  free(buffer2);

  CGDataProviderRef provider = CGDataProviderCreateWithCFData(imageData);
  CFRelease(imageData);

  // Bitmap format
  int bitsPerComponent = 8;
  int bitsPerPixel = 32;
  CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
  CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault | kCGImageAlphaPremultipliedLast;
  CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

  // Create the CGImage
  CGImageRef imageRef = CGImageCreate(glSize.width,
                                      glSize.height,
                                      bitsPerComponent,
                                      bitsPerPixel,
                                      bytesPerRow,
                                      colorSpaceRef,
                                      bitmapInfo,
                                      provider,
                                      NULL,
                                      NO,
                                      renderingIntent);

  // Clean up
  CGColorSpaceRelease(colorSpaceRef);
  CGDataProviderRelease(provider);

  // Convert to UIImage
  UIImage *image = [[UIImage alloc] initWithCGImage:imageRef];
  CGImageRelease(imageRef);

  return [image autorelease];
}

Make sure you bind the framebuffer before doing this, like so:

glBindFramebufferOES(GL_FRAMEBUFFER_OES, myFrameBuffer);
glViewport(0, 0, myBackingWidth, myBackingHeight);

Call -glToUIImage before presenting the framebuffer.

For more details, Apple provides sample code for taking screenshots from OpenGL.
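The buffer-size and retina arithmetic that these snippets rely on is easy to get wrong, so here it is spelled out as a small C sketch (the struct and function names are mine, not from any library):

```c
#include <stddef.h>

/* Size arithmetic used by the snapshot code: RGBA is 4 bytes per pixel, and
 * on a retina display the framebuffer is contentScaleFactor times larger
 * (per axis) than the view's bounds in points. */
struct snapshot_geometry {
    size_t bytesPerRow;   /* row stride passed to CGImageCreate */
    size_t dataLength;    /* total bytes glReadPixels will write */
    int widthInPoints;    /* size of the final UIImage's drawing context */
    int heightInPoints;
};

static struct snapshot_geometry compute_geometry(int widthPx, int heightPx, int scale) {
    struct snapshot_geometry g;
    g.bytesPerRow = (size_t)widthPx * 4;
    g.dataLength = g.bytesPerRow * (size_t)heightPx;
    g.widthInPoints = widthPx / scale;
    g.heightInPoints = heightPx / scale;
    return g;
}
```

If the bytesPerRow handed to CGImageCreate doesn't match width * 4, the resulting image comes out sheared or garbled, which is a common symptom when the read size and the image size disagree.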

Answer 3 (score: 2):

I came up with this possible solution. You have to modify the isgl3d library a bit.

The steps are as follows:

1. Create a delegate for Isgl3dGLContext1.

In Isgl3dGLContext1.h:

@protocol ScreenShooterDelegate;

#import <OpenGLES/ES1/gl.h>
#import <OpenGLES/ES1/glext.h>
#import "Isgl3dGLContext.h"

@interface Isgl3dGLContext1 : Isgl3dGLContext {

    NSObject<ScreenShooterDelegate>* __unsafe_unretained delegate;

    GLuint _colorRenderBuffer;
@private
    EAGLContext * _context;

    // The OpenGL names for the framebuffer and renderbuffer used to render to this view
    GLuint _defaultFrameBuffer;
    GLuint _depthAndStencilRenderBuffer;
    GLuint _depthRenderBuffer;
    GLuint _stencilRenderBuffer;

    // OpenGL MSAA buffers
    GLuint _msaaFrameBuffer;
    GLuint _msaaColorRenderBuffer;

    GLuint _msaaDepthAndStencilRenderBuffer;
    GLuint _msaaDepthRenderBuffer;
    GLuint _msaaStencilRenderBuffer;
}

- (id) initWithLayer:(CAEAGLLayer *) layer;
@property (assign) NSObject<ScreenShooterDelegate>* delegate;
@property BOOL takePicture;
@property GLuint colorRenderBuffer;

@end

@protocol ScreenShooterDelegate

@optional

- (void)takePicture;

@end

2. Add this code to Isgl3dGLContext1.m:

@synthesize takePicture;
@synthesize colorRenderBuffer = _colorRenderBuffer;

Then, in -(void)finalizeRender, before the line [_context presentRenderbuffer:GL_RENDERBUFFER_OES];, add:

if (takePicture) {
    takePicture = NO;
    if ([delegate respondsToSelector:@selector(takePicture)]) {
        [delegate takePicture];
    }
}

3. Put this code in the class where you want to take the screenshot.

In Class.h, add <ScreenShooterDelegate>.

In a method in Class.m:

[Isgl3dDirector sharedInstance].antiAliasingEnabled = NO;

Photos3DAppDelegate *appDelegate = (Photos3DAppDelegate *)[[UIApplication sharedApplication] delegate];
[appDelegate.inOutSceneView showSphere];

Isgl3dEAGLView *eaglview = (Isgl3dEAGLView *)[[Isgl3dDirector sharedInstance] openGLView];
Isgl3dGLContext1 *_glContext = (Isgl3dGLContext1 *)[eaglview glContext];
_glContext.delegate = self;
_glContext.takePicture = YES;

In the method -(void)takePicture, put the code from Apple, and at the end of the method add [Isgl3dDirector sharedInstance].antiAliasingEnabled = YES; (if you're using it).

//https://developer.apple.com/library/ios/#qa/qa1704/_index.html

-(void)takePicture {
    NSLog(@"Creating Foto");

    GLint backingWidth, backingHeight;

    Isgl3dEAGLView *eaglview = (Isgl3dEAGLView *)[[Isgl3dDirector sharedInstance] openGLView];
    //Isgl3dGLContext1 *_glContext = (Isgl3dGLContext1 *)[eaglview glContext];
    //glBindRenderbufferOES(GL_RENDERBUFFER_OES, _glContext.colorRenderBuffer);

    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

    NSInteger x = 0, y = 0, width = backingWidth, height = backingHeight;
    NSInteger dataLength = width * height * 4;
    GLubyte *data = (GLubyte *)malloc(dataLength * sizeof(GLubyte));

    // Read pixel data from the framebuffer
    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);

    // Create a CGImage with the pixel data
    // If your OpenGL ES content is opaque, use kCGImageAlphaNoneSkipLast to ignore the alpha channel;
    // otherwise, use kCGImageAlphaPremultipliedLast
    CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorspace, kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
                                    ref, NULL, true, kCGRenderingIntentDefault);

    // OpenGL ES measures data in PIXELS
    // Create a graphics context with the target size measured in POINTS
    NSInteger widthInPoints, heightInPoints;
    if (NULL != UIGraphicsBeginImageContextWithOptions) {
        // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration
        // Set the scale parameter to your OpenGL ES view's contentScaleFactor
        // so that you get a high-resolution snapshot when its value is greater than 1.0
        CGFloat scale = eaglview.contentScaleFactor;
        widthInPoints = width / scale;
        heightInPoints = height / scale;
        UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale);
    }
    else {
        // On iOS prior to 4, fall back to UIGraphicsBeginImageContext
        widthInPoints = width;
        heightInPoints = height;
        UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints));
    }

    CGContextRef cgcontext = UIGraphicsGetCurrentContext();

    // The UIKit coordinate system is upside down relative to the GL/Quartz coordinate system
    // Flip the CGImage by rendering it to the flipped bitmap context
    // The size of the destination area is measured in POINTS
    CGContextSetBlendMode(cgcontext, kCGBlendModeCopy);
    CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref);

    // Retrieve the UIImage from the current context
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();

    UIGraphicsEndImageContext();

    // Clean up
    free(data);
    CFRelease(ref);
    CFRelease(colorspace);
    CGImageRelease(iref);

    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);

    [Isgl3dDirector sharedInstance].antiAliasingEnabled = YES;
}

Note: for me it worked simply by commenting out glBindRenderbufferOES(GL_RENDERBUFFER_OES, _colorRenderbuffer);. In your case, you may need to perform these steps with Isgl3dGLContext2 instead of Isgl3dGLContext1.