I have a third-party game object in my iOS 6 app whose implementation I cannot access, and I need to take a screenshot of its view.
The game object exposes a public method that lets me add it to my own view:

-initWithParentView:andController:

I managed to grab the view like this:

self.gameview = [self.myView.subviews firstObject]; // there is only one subview
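Since the subview layout is an implementation detail of the third-party library, a type check is a safer way to grab the view. A minimal sketch, assuming cocos2d's CCGLView header is already imported:

```objc
// Defensive lookup of the game view: guard against the third-party
// library changing its subview hierarchy in a future version.
for (UIView *sub in self.myView.subviews) {
    if ([sub isKindOfClass:[CCGLView class]]) {
        self.gameview = (CCGLView *)sub;
        break;
    }
}
```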
self.gameview is of type CCGLView, so I extended CCGLView with a snapshot method, as suggested in Apple's documentation (http://nathanmock.com/files/com.apple.adc.documentation.AppleiOS6.0.iOSLibrary.docset/Contents/Resources/Documents/#qa/qa1704/_index.html):

- (UIImage *)snapshot
{
    // Get the size of the backing CAEAGLLayer
    GLint backingWidth, backingHeight;
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

    NSInteger x = 0;
    NSInteger y = 0;
    NSInteger width = backingWidth;
    NSInteger height = backingHeight;
    NSInteger dataLength = width * height * 4;
    GLubyte *data = (GLubyte *)malloc(dataLength * sizeof(GLubyte));

    // Read pixel data from the framebuffer
    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);

    // Create a CGImage with the pixel data.
    // If your OpenGL ES content is opaque, use kCGImageAlphaNoneSkipLast to ignore
    // the alpha channel; otherwise, use kCGImageAlphaPremultipliedLast.
    CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorspace,
                                    kCGBitmapByteOrderDefault | kCGImageAlphaPremultipliedLast,
                                    ref, NULL, true, kCGRenderingIntentDefault);

    // OpenGL ES measures data in PIXELS.
    // Create a graphics context with the target size measured in POINTS.
    NSInteger widthInPoints;
    NSInteger heightInPoints;
    if (NULL != UIGraphicsBeginImageContextWithOptions)
    {
        // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the
        // scale into consideration. Set the scale parameter to the OpenGL ES view's
        // contentScaleFactor so that you get a high-resolution snapshot when its
        // value is greater than 1.0.
        CGFloat scale = self.contentScaleFactor;
        widthInPoints = width / scale;
        heightInPoints = height / scale;
        UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale);
    }
    else
    {
        // On iOS prior to 4, fall back to UIGraphicsBeginImageContext
        widthInPoints = width;
        heightInPoints = height;
        UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints));
    }

    CGContextRef cgcontext = UIGraphicsGetCurrentContext();

    // The UIKit coordinate system is upside down relative to the GL/Quartz coordinate
    // system. Flip the CGImage by rendering it to the flipped bitmap context.
    // The size of the destination area is measured in POINTS.
    CGContextSetBlendMode(cgcontext, kCGBlendModeCopy);
    CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref);

    // Retrieve the UIImage from the current context
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);

    // Clean up
    free(data);
    CFRelease(ref);
    CFRelease(colorspace);
    CGImageRelease(iref);

    return image;
}
I also handle the framebuffer, as shown in Apple's EAGLView sample code:
- (void)setFramebuffer
{
    if (self.context)
    {
        [EAGLContext setCurrentContext:self.context];
        if (!defaultFramebuffer)
            [self createFramebuffer];
        glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);
        glViewport(0, 0, framebufferWidth, framebufferHeight);
    }
}
- (void)createFramebuffer
{
    if (self.context && !defaultFramebuffer)
    {
        [EAGLContext setCurrentContext:self.context];

        // Create default framebuffer object.
        glGenFramebuffers(1, &defaultFramebuffer);
        glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);
        GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
        NSLog(@"%u", status); // logs 36055, i.e. GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT,
                              // which is expected here because nothing is attached yet

        // Create color render buffer and allocate backing store.
        glGenRenderbuffers(1, &colorRenderbuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);

        CAEAGLLayer *glLayer = (CAEAGLLayer *)self.layer;
        glLayer.drawableProperties = @{ kEAGLDrawablePropertyRetainedBacking : @NO,
                                        kEAGLDrawablePropertyColorFormat : kEAGLColorFormatRGBA8 };
        glLayer.opaque = YES;

        [self.context renderbufferStorage:GL_RENDERBUFFER fromDrawable:glLayer];
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &framebufferWidth);
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &framebufferHeight);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorRenderbuffer);

        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
            NSLog(@"Failed to make complete framebuffer object %x", glCheckFramebufferStatus(GL_FRAMEBUFFER));
    }
}
- (BOOL)presentFramebuffer
{
    BOOL success = NO;
    if (self.context)
    {
        [EAGLContext setCurrentContext:self.context];
        glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
        success = [self.context presentRenderbuffer:GL_RENDERBUFFER];
    }
    return success;
}
With everything set up, I invoke the snapshot like this:

[self.gameview setFramebuffer];
UIImage *image = [self.gameview snapshot];
[self.gameview presentFramebuffer];
return image;

All I get is a white image.
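One detail from Apple's QA1704 seems relevant: with kEAGLDrawablePropertyRetainedBacking set to NO, the renderbuffer contents are undefined once -presentRenderbuffer: returns, so the glReadPixels inside snapshot must run after the frame is actually drawn but before the present. A sketch of the required ordering (the render step is a placeholder, since the third-party game draws on its own schedule):

```objc
// Correct ordering when the layer does not retain its backing:
// render, read the pixels, then present. Reading a buffer that has
// not been drawn into since the last present yields undefined data,
// which can show up as a blank/white image.
[self.gameview setFramebuffer];
// ... the frame must be rendered here; in this setup the third-party
// game performs its own drawing, which this code cannot hook into ...
UIImage *image = [self.gameview snapshot];  // glReadPixels reads the just-drawn frame
[self.gameview presentFramebuffer];         // after this, buffer contents are undefined
```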
I'm particularly suspicious of the createFramebuffer method, since I never initialized the OpenGL view myself. I tried to fetch the view's current framebuffer instead, with no luck, using:

glGetIntegerv(GL_FRAMEBUFFER, &defaultFramebuffer);
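For reference, GL_FRAMEBUFFER is not a valid pname for glGetIntegerv; it is only a bind target. Querying the currently bound framebuffer uses GL_FRAMEBUFFER_BINDING. A sketch:

```objc
// Query the framebuffer and renderbuffer the third-party view currently
// has bound, instead of generating new (empty) ones.
GLint boundFramebuffer = 0;
glGetIntegerv(GL_FRAMEBUFFER_BINDING, &boundFramebuffer);

GLint boundRenderbuffer = 0;
glGetIntegerv(GL_RENDERBUFFER_BINDING, &boundRenderbuffer);
```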
It's worth mentioning that I did manage to capture part of the screen whenever I drew something with glDrawArrays before calling snapshot, but no matter how I varied the array values I couldn't get it fully working. (I'm not very familiar with OpenGL.)
I'd appreciate it if anyone could point out whatever I'm missing here.
Answer 0 (score: 0)
I found the solution. It turns out I was messing the whole thing up with OpenGL. The secret lies in cocos2d's CCRenderTexture:
#import <CCRenderTexture.h>
#import <CCScene.h>

CCScene *scene = [[CCDirector sharedDirector] runningScene];
CCNode *n = [scene.children objectAtIndex:0];
[CCDirector sharedDirector].nextDeltaTimeZero = YES;
CGSize winSize = [CCDirector sharedDirector].winSize;
CCRenderTexture *rtx = [CCRenderTexture renderTextureWithWidth:winSize.width height:winSize.height];
[rtx begin];
[n visit];  // re-render the scene graph into the offscreen texture
[rtx end];
UIImage *image = [rtx getUIImage];
return image;
The code works perfectly in both iOS 6 and iOS 7.
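For reuse, the snippet above can be wrapped in a helper. The method name cocosSnapshot is my own, and this assumes a cocos2d 2.x setup where CCRenderTexture provides getUIImage:

```objc
// Hypothetical helper wrapping the CCRenderTexture capture shown above.
- (UIImage *)cocosSnapshot
{
    CCDirector *director = [CCDirector sharedDirector];
    CCNode *node = [director.runningScene.children objectAtIndex:0];
    director.nextDeltaTimeZero = YES;  // avoid a time jump from the extra visit
    CGSize winSize = director.winSize;
    CCRenderTexture *rtx = [CCRenderTexture renderTextureWithWidth:winSize.width
                                                            height:winSize.height];
    [rtx begin];
    [node visit];  // draw the scene graph into the offscreen texture
    [rtx end];
    return [rtx getUIImage];
}
```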