From my previous question,
CIImage drawing EXC_BAD_ACCESS,
I learned to work around a Core Image problem by telling the CIContext to use software rendering. Now I'm trying to track down a crash that happens when AppKit tries to draw an NSImageView that I've set up to display a CIImage with the following code:
- (void)setCIImage:(CIImage *)processedImage;
{
    NSSize size = [processedImage extent].size;
    if (size.width == 0) {
        [self setImage:nil];
        return;
    }

    NSData * pixelData = [[OMFaceRecognizer defaultRecognizer] imagePlanarFData:processedImage];
    LCDocument * document = [[[self window] windowController] document];
    [[NSNotificationCenter defaultCenter] postNotificationName:LCCapturedImageNotification
                                                        object:document
                                                      userInfo:@{ @"data": pixelData, @"size": [NSValue valueWithSize:size] }];
#if 1
    static dispatch_once_t onceToken;
    static CGColorSpaceRef colorSpace;
    dispatch_once(&onceToken, ^ {
        colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericGray);
    });
    // Computed per call: caching this in the dispatch_once block would go
    // stale if the image width ever changed.
    size_t bytesPerRow = size.width * sizeof(float);

    // For whatever bizarre reason, CoreGraphics uses big-endian floats (!)
    const float * data = [[[OMFaceRecognizer defaultRecognizer] byteswapPlanarFData:pixelData
                                                                        swapInPlace:NO] bytes];
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, data, bytesPerRow * size.height, NULL);
    CGImageRef renderedImage = CGImageCreate(size.width, size.height, 32, 32, bytesPerRow, colorSpace,
                                             kCGImageAlphaNone | kCGBitmapFloatComponents,
                                             provider, NULL, false, kCGRenderingIntentDefault);
    CGDataProviderRelease(provider);
    NSImage * image = [[NSImage alloc] initWithCGImage:renderedImage
                                                  size:size];
    CGImageRelease(renderedImage);
#else
    NSCIImageRep * rep = [NSCIImageRep imageRepWithCIImage:processedImage];
    NSImage * image = [[NSImage alloc] initWithSize:size];
    [image addRepresentation:rep];
#endif
    [self setImage:image];
}
Is there a way to make NSImageView use software rendering? I've looked around in IB, but I don't see anything that looks promising...
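For context, the kind of workaround that fixed the previous question was to do the Core Image rendering myself through a context created with kCIContextUseSoftwareRenderer, so that the view only ever sees a plain bitmap. A sketch of that approach (untested against this particular crash, and assuming an SDK that provides +[CIContext contextWithOptions:]):

```objc
// Render the CIImage ourselves with a software-rendering CIContext,
// instead of letting NSImageView/NSCIImageRep pick its own context.
NSDictionary * options = @{ kCIContextUseSoftwareRenderer: @YES };
CIContext * softwareContext = [CIContext contextWithOptions:options];
CGImageRef cgImage = [softwareContext createCGImage:processedImage
                                           fromRect:[processedImage extent]];
NSImage * image = [[NSImage alloc] initWithCGImage:cgImage
                                              size:[processedImage extent].size];
CGImageRelease(cgImage);
[self setImage:image];
```

This sidesteps the question of configuring NSImageView itself, since the view never touches the CIImage directly.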