I'm trying to take a screenshot with UIActivityViewController and save it to the Photos app on an iPhone/iPad device. In the simulator everything displays correctly, but when I switch to a real device only part of the image shows up. Here is a screenshot of the problem.
I merge three different UIImages into a single image so that I can take a screenshot of it.
First, I merge the background image (the bridge UIImage) with the star image:
- (UIImage *)mergeUIImageView:(UIImage *)bkgound
                     FrontPic:(UIImage *)fnt
                    FrontPicX:(CGFloat)xPos
                    FrontPicY:(CGFloat)yPos
                FrontPicWidth:(CGFloat)picWidth
               FrontPicHeight:(CGFloat)picHeight
                    FinalSize:(CGSize)finalSize
{
    // NOTE: the context is sized to the landscape screen (height x width);
    // the finalSize parameter is currently unused.
    UIGraphicsBeginImageContext(CGSizeMake([UIScreen mainScreen].bounds.size.height,
                                           [UIScreen mainScreen].bounds.size.width));
    // bkgound - the bridge image, drawn to fill the whole context
    [bkgound drawInRect:CGRectMake(0, 0,
                                   [UIScreen mainScreen].bounds.size.height,
                                   [UIScreen mainScreen].bounds.size.width)];
    // fnt - the star image, drawn at the requested position and size
    [fnt drawInRect:CGRectMake(xPos, yPos, picWidth, picHeight)];
    // read the merged image back out of the context
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
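For context, a call to this method might look something like the following sketch (the asset names and the star's position and size are illustrative, not taken from the question):

UIImage *bridge = [UIImage imageNamed:@"bridge"];   // hypothetical asset name
UIImage *star   = [UIImage imageNamed:@"star"];     // hypothetical asset name
UIImage *merged = [self mergeUIImageView:bridge
                                FrontPic:star
                               FrontPicX:100.0f
                               FrontPicY:50.0f
                           FrontPicWidth:60.0f
                          FrontPicHeight:60.0f
                               FinalSize:[UIScreen mainScreen].bounds.size];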
Then I merge this picture with the OpenGL-rendered picture (the green line).
a) First I convert the OpenGL rendering to a UIImage with this function:
// Release callback so the CGDataProvider frees the pixel buffer it owns.
// (Without this, and the releases below, the method leaks on every call.)
static void releasePixelData(void *info, const void *data, size_t size)
{
    free((void *)data);
}

- (UIImage *)glToUIImage
{
    float scaleFactor = [[UIScreen mainScreen] scale];
    CGRect screen = [[UIScreen mainScreen] bounds];
    // landscape orientation: width and height are swapped, measured in pixels
    CGFloat image_height = screen.size.width * scaleFactor;
    CGFloat image_width = screen.size.height * scaleFactor;
    NSInteger myDataLength = image_width * image_height * 4;

    // allocate array and read pixels into it.
    GLubyte *buffer = (GLubyte *)malloc(myDataLength);
    glReadPixels(0, 0, image_width, image_height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    // gl renders "upside down" so swap top to bottom into new array.
    // there's gotta be a better way, but this works.
    GLubyte *buffer2 = (GLubyte *)malloc(myDataLength);
    for (int y = 0; y < image_height; y++)
    {
        for (int x = 0; x < image_width * 4; x++)
        {
            buffer2[(int)((image_height - 1 - y) * image_width * 4 + x)] =
                buffer[(int)(y * 4 * image_width + x)];
        }
    }
    free(buffer); // the unflipped copy is no longer needed

    // make data provider with data; it takes ownership of buffer2.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, releasePixelData);

    // prep the ingredients
    int bitsPerComponent = 8;
    int bitsPerPixel = 32;
    int bytesPerRow = 4 * image_width;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    // make the cgimage
    CGImageRef imageRef = CGImageCreate(image_width, image_height, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);

    // then make the uiimage from that (it retains the CGImage)
    UIImage *myImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    CGColorSpaceRelease(colorSpaceRef);
    CGDataProviderRelease(provider);
    return myImage;
}
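One timing detail worth noting: glReadPixels can only return valid data while the framebuffer still holds the rendered frame, so the capture has to happen after the draw calls and before the buffer is presented. A minimal sketch of that ordering, assuming an OpenGL ES 2.0 setup where self.context is the EAGLContext and the color renderbuffer is already bound (renderScene, wantsScreenshot, and capturedImage are hypothetical names):

- (void)drawFrame
{
    [EAGLContext setCurrentContext:self.context];
    [self renderScene];   // hypothetical method that issues the GL draw calls
    if (self.wantsScreenshot) {
        // read back now, while the frame is still in the (possibly non-retained) buffer
        self.capturedImage = [self glToUIImage];
        self.wantsScreenshot = NO;
    }
    [self.context presentRenderbuffer:GL_RENDERBUFFER];
}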
b) Then I merge this OpenGL image with the image above (bridge + star) in the same way:
- (UIImage *)screenshot
{
    // get the OpenGL image from the function above
    UIImage *image = [self glToUIImage];
    CGRect pos = CGRectMake(0, 0, image.size.width, image.size.height);
    UIGraphicsBeginImageContext(image.size);
    [image drawInRect:pos];
    // draw the merged (bridge + star) image over the GL rendering
    [self.background.image drawInRect:pos];
    UIImage *final = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return final;
}
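The merged result can then be handed to UIActivityViewController to save or share it, as described at the top of the question. A minimal sketch (it assumes the presenting view controller is the same one that owns these methods):

UIImage *snapshot = [self screenshot];
UIActivityViewController *activityVC =
    [[UIActivityViewController alloc] initWithActivityItems:@[snapshot]
                                      applicationActivities:nil];
[self presentViewController:activityVC animated:YES completion:nil];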
It works on the simulators (iPhone, iPad, iPhone with Retina, and iPad with Retina), all running 6.0. But when I switch to a real device (iPhone 4/4s/5, iPad 2/mini/Retina), it only shows the star image. The Xcode version is 4.6.3, the base SDK is the latest iOS (6.1), and the iOS deployment target is 5.0. Can you tell me how to fix this? Thanks.
Answer 0 (score: 0)
The problem is that iOS 6.0 does not keep the render buffer contents around; it discards them after each frame is presented. The screenshot, however, reads its data from that buffer, which is why I kept getting a black background. Adding a category that makes the device retain the buffer contents solves the problem:
@interface CAEAGLLayer (Retained)
@end

@implementation CAEAGLLayer (Retained)
- (NSDictionary *)drawableProperties
{
    return @{kEAGLDrawablePropertyRetainedBacking : @(YES)};
}
@end
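Note that a category like this overrides drawableProperties for every CAEAGLLayer in the app. An alternative is to set the property on just your own GL layer when you configure the context; a sketch, assuming glView is your OpenGL-backed view:

CAEAGLLayer *eaglLayer = (CAEAGLLayer *)glView.layer;  // glView: your GL view (assumption)
eaglLayer.drawableProperties = @{
    kEAGLDrawablePropertyRetainedBacking : @YES,
    kEAGLDrawablePropertyColorFormat     : kEAGLColorFormatRGBA8
};

Either way, keep in mind that retained backing can cost performance, since the system has to preserve the buffer contents between frames; Apple's documentation recommends enabling it only when you actually need to read the rendered frame back.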