Most of this code works, but the resulting image data seems to be losing a color channel (just as I suspected), because the displayed result image is tinted blue!
Here is the code:
UIImage *myImage = [UIImage imageNamed:@"sample1.png"];
CGImageRef imageRef = [myImage CGImage];
CVImageBufferRef pixelBuffer = [self pixelBufferFromCGImage:imageRef];
The method pixelBufferFromCGImage was grabbed from another Stack Overflow post, How do I export UIImage array as a movie? (even though that application is unrelated to what I'm trying to do). It is:
+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
{
    CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));
    NSDictionary *options = @{
        (__bridge NSString *)kCVPixelBufferCGImageCompatibilityKey: @(NO),
        (__bridge NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey: @(NO)
    };
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width,
        frameSize.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)options,
        &pixelBuffer);
    if (status != kCVReturnSuccess) {
        return NULL;
    }

    // Draw the CGImage directly into the pixel buffer's backing memory.
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    void *data = CVPixelBufferGetBaseAddress(pixelBuffer);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(data, frameSize.width, frameSize.height,
        8, CVPixelBufferGetBytesPerRow(pixelBuffer), rgbColorSpace,
        (CGBitmapInfo)kCGImageAlphaNoneSkipLast);
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
        CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    return pixelBuffer;
}
I think it has something to do with the relationship between kCVPixelFormatType_32ARGB and kCGImageAlphaNoneSkipLast, although I've tried every combination and either got the same result or crashed the app. Once again, this converts UIImage data into a CVImageBufferRef, but when I display the image on screen it appears to be missing a color channel and shows up tinted blue. The image is a PNG.
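For reference, a quick way to sanity-check the buffer itself is to log what CVPixelBufferCreate actually produced; this is a debugging sketch, not part of the original code:

// Debugging sketch: verify the buffer's properties right after creation.
// kCVPixelFormatType_32ARGB is the integer 32, so format should log as 32.
OSType format = CVPixelBufferGetPixelFormatType(pixelBuffer);
NSLog(@"format=%u width=%zu height=%zu bytesPerRow=%zu",
      (unsigned)format,
      CVPixelBufferGetWidth(pixelBuffer),
      CVPixelBufferGetHeight(pixelBuffer),
      CVPixelBufferGetBytesPerRow(pixelBuffer));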
Answer 0 (score: 5)
The solution is that this code works exactly as intended. :) The problem was in using the data to create an OpenGL texture, which is completely unrelated to this code. Anyone searching for how to convert a UIImage to a CVImageBufferRef, your answer is in the code above!
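For anyone who hits the same wall: a blue tint in a GL texture is typically a red/blue channel swap caused by the upload format. Below is a minimal sketch of a correct upload, assuming an active OpenGL ES 2.0 context, a buffer created with kCVPixelFormatType_32BGRA (which iOS can upload directly via GL_BGRA_EXT), and tightly packed rows; the function name TextureFromPixelBuffer is just illustrative:

#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>
#import <CoreVideo/CoreVideo.h>

// Sketch: upload a BGRA pixel buffer as a GL texture. Passing GL_RGBA as the
// source format for BGRA bytes swaps red and blue (hence the blue tint);
// GL_BGRA_EXT matches the buffer's actual memory layout.
// Assumes bytesPerRow == width * 4; CVPixelBuffer rows can be padded, so
// check CVPixelBufferGetBytesPerRow before relying on this.
static GLuint TextureFromPixelBuffer(CVPixelBufferRef pixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    GLuint texture = 0;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    // No mipmaps are generated, so the min filter must not be mipmap-based.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                 (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
                 (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
                 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE,
                 CVPixelBufferGetBaseAddress(pixelBuffer));
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    return texture;
}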
Answer 1 (score: 1)
In case anyone is still looking for a solution to this problem, I solved it by switching the BOOLs in the pixelBuffer's options:
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:NO], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:NO], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
from NO to YES:
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
Answer 2 (score: 1)
I ran into the same problem and found some samples: http://www.cakesolutions.net/teamblogs/2014/03/08/cmsamplebufferref-from-cgimageref
Try changing the bitmap info to:
CGBitmapInfo bitmapInfo = (CGBitmapInfo)(kCGBitmapByteOrder32Little |
                                         kCGImageAlphaPremultipliedFirst);
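For context, here is a sketch of where those flags would plug into the method above, assuming the pixel buffer itself is also created with the matching kCVPixelFormatType_32BGRA format, since little-endian premultiplied-first is BGRA in memory:

// Sketch: the suggested bitmap info in place of kCGImageAlphaNoneSkipLast.
CGBitmapInfo bitmapInfo = (CGBitmapInfo)(kCGBitmapByteOrder32Little |
                                         kCGImageAlphaPremultipliedFirst);
CGContextRef context = CGBitmapContextCreate(data, frameSize.width, frameSize.height,
    8, CVPixelBufferGetBytesPerRow(pixelBuffer), rgbColorSpace, bitmapInfo);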
Answer 3 (score: 1)
This is what actually worked for me:
+ (CVPixelBufferRef)pixelBufferFromImage:(CGImageRef)image {
    // Not sure why this is even necessary; using CGImageGetWidth/Height directly
    // in the create/context calls seems to work fine too.
    CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width,
        frameSize.height, kCVPixelFormatType_32BGRA, NULL, &pixelBuffer);
    if (status != kCVReturnSuccess) {
        return NULL;
    }
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    void *data = CVPixelBufferGetBaseAddress(pixelBuffer);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    // The 32BGRA pixel format is matched here with little-endian,
    // premultiplied-first bitmap info, which is BGRA in memory.
    CGContextRef context = CGBitmapContextCreate(data, frameSize.width, frameSize.height,
        8, CVPixelBufferGetBytesPerRow(pixelBuffer), rgbColorSpace,
        (CGBitmapInfo)(kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst));
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
        CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    return pixelBuffer;
}
You can convert the pixel buffer back into a UIImage (and then display or save it) to confirm that this method works:
+ (UIImage *)imageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer {
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef myImage = [context createCGImage:ciImage
                                       fromRect:CGRectMake(0, 0,
                                                CVPixelBufferGetWidth(pixelBuffer),
                                                CVPixelBufferGetHeight(pixelBuffer))];
    UIImage *image = [UIImage imageWithCGImage:myImage];
    CGImageRelease(myImage); // createCGImage returns a +1 reference
    // Uncomment the following lines to save the image to your application's
    // documents directory (documentsDirectory is assumed to be defined elsewhere):
    //NSString *imageSavePath = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"myImageFromPixelBuffer.png"]];
    //[UIImagePNGRepresentation(image) writeToFile:imageSavePath atomically:YES];
    return image;
}
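A hypothetical round-trip check (MyConverter stands in for whatever class hosts the two methods above):

// Round trip: UIImage -> CVPixelBufferRef -> UIImage. If the result looks
// identical to the original, no channels were swapped or dropped.
UIImage *original = [UIImage imageNamed:@"sample1.png"];
CVPixelBufferRef buffer = [MyConverter pixelBufferFromImage:original.CGImage];
UIImage *roundTripped = [MyConverter imageFromPixelBuffer:buffer];
CVPixelBufferRelease(buffer); // pixelBufferFromImage returns a +1 reference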
Answer 4 (score: 0)
Just to clarify the answer above: I ran into the same problem because my shader code expected two layered samples in the image buffer, while I was using a single-layer buffer.
This line takes the RGB values from the one sample and passes them on to (I'm not sure what), but the end result is a full-color image.
gl_FragColor = vec4(texture2D(SamplerY, texCoordVarying).rgb, 1);
Answer 5 (score: -1)
Sounds like it could be that relationship. Maybe you have a JPG with RGB rather than indexed color with a PNG?