CIAreaMaximum and CIAreaMinimum return incorrect or stale results

Date: 2015-04-08 15:40:27

Tags: macos core-graphics core-image

Grayscale image

When I apply the CIAreaMaximum or CIAreaMinimum filter to the attached grayscale image, they return 0.824 and 0.411 respectively. However, if I read the underlying data myself, the minimum is 0.34 (86/255) and the maximum is 0.78 (200/255). The filter code looks like this:

CIVector *extentRect = [CIVector vectorWithCGRect:image.extent];
CIFilter *maxFilter = [CIFilter filterWithName:@"CIAreaMaximum"];
[maxFilter setValue:image forKey:kCIInputImageKey];
[maxFilter setValue:extentRect forKey:kCIInputExtentKey];
NSBitmapImageRep *bitmap = [[NSBitmapImageRep alloc] initWithCIImage:[maxFilter outputImage]];
NSColor *color = [bitmap colorAtX:0 y:0];
return color.redComponent; // Since this is gray scale image, any component would do.

Here is the code I use to draw the CIImage into a bitmap buffer so that I can inspect individual pixel values:

// Create the buffer, color space and context.
    int pixelsWide = CGImageGetWidth(image);
    int pixelsHigh = CGImageGetHeight(image);
    CGColorSpaceRef colorSpace;
    int             bitmapByteCount;
    int             bitmapBytesPerRow;

    bitmapBytesPerRow   = pixelsWide * 4;                // 4 bytes per pixel (RGBA)
    bitmapByteCount     = bitmapBytesPerRow * pixelsHigh;

    colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
    bitmapData = malloc(bitmapByteCount);
    if (bitmapData == NULL)
    {
        fprintf(stderr, "Memory not allocated!");
    }
    bitmapContext = CGBitmapContextCreate(bitmapData,
                                     pixelsWide,
                                     pixelsHigh,
                                     8,      // bits per component
                                     bitmapBytesPerRow,
                                     colorSpace,
                                     kCGImageAlphaPremultipliedLast);
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], kCIContextUseSoftwareRenderer, nil];
    bitmapCIContext = [CIContext contextWithCGContext:bitmapContext options:options]; // was options:nil, which silently discarded the dictionary above

  // later on use CIContext to draw the ciImage to the bitmap buffer.
  // Here the extent is the entire image
    [bitmapCIContext drawImage:ciImage inRect:ciImage.extent fromRect:ciImage.extent];

    int min = 255;
    int max = 0;
    for (int y = 0; y < image.extent.size.height; ++y) {
        for (int x = 0; x < image.extent.size.width; ++x) {
            int index = (y * image.extent.size.width + x) * 4;
            unsigned char value = bitmapData[index];
            if (value < min)
                min = value;
            if (value > max)   // independent test; an `else if` here can miss the maximum
                max = value;
        }
    }
    NSLog(@"min = %d (%f), max = %d (%f)", min, (float)min / 255, max, (float)max/255);

I'm wondering whether I'm using the filters incorrectly.

I've also run into another problem with these two filters. When the input to the maximum filter is the output of an image accumulator (or a composite of the accumulator with another CIImage), the result of the maximum filter sometimes is clearly not based on the latest data in the accumulator. Any pointers on resolving this would be appreciated.

2 answers:

Answer 0 (score: 1)

My guess is that you are seeing unexpected values because of Core Image's default color management. Core Image converts the input image from its own color space into CI's working space and computes the minimum/maximum in that space. Then, when you render the 1x1-pixel result image to read out the value, CI color-matches from the working space to the output color space of the destination context.

You can turn all of this off by creating the CIContext with the option kCIContextWorkingColorSpace set to [NSNull null].

Answer 1 (score: 1)

This is the only thing that worked correctly and efficiently for me...

Here is my output...

2015-07-17 14:58:03.751 Chroma Photo Editing Extension[1945:155358] CIAreaMinimum output: 255, 27, 0, 0

2015-07-17 15:00:08.086 Chroma Photo Editing Extension[2156:157963] CIAreaAverage output: 255, 191, 166, 155

2015-07-17 15:01:24.047 Chroma Photo Editing Extension[2253:159246] CIAreaMaximum output: 255, 255, 255, 238

...from the following code (for iOS):

- (CIImage *)outputImage
{
    [GlobalCIImage sharedSingleton].ciImage = self.inputImage;
    
    CGRect inputExtent = [[GlobalCIImage sharedSingleton].ciImage extent];
    CIVector *extent = [CIVector vectorWithX:inputExtent.origin.x
                                           Y:inputExtent.origin.y
                                           Z:inputExtent.size.width
                                           W:inputExtent.size.height];
    CIImage *inputAverage = [CIFilter filterWithName:@"CIAreaMaximum" keysAndValues:kCIInputImageKey, [GlobalCIImage sharedSingleton].ciImage, kCIInputExtentKey, extent, nil].outputImage;
    // Render the 1x1 result directly into a 4-byte buffer.
    // (byteBuffer is not actually read below; the CGContext path is used instead.)
    size_t rowBytes = 4;
    uint8_t byteBuffer[rowBytes];
    
    [[GlobalContext sharedSingleton].ciContext render:inputAverage toBitmap:byteBuffer rowBytes:rowBytes bounds:[inputAverage extent] format:kCIFormatRGBA8 colorSpace:nil];
    
    int width = inputAverage.extent.size.width;
    int height = inputAverage.extent.size.height;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, width * 4, colorSpace, kCGBitmapAlphaInfoMask & kCGImageAlphaPremultipliedFirst);
    
    CGColorSpaceRelease(colorSpace);
    
    CGImageRef cgImage = [[GlobalContext sharedSingleton].ciContext createCGImage:inputAverage fromRect:CGRectMake(0, 0, width, height)];
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
    CGImageRelease(cgImage); // the original code leaked this image
    
    unsigned int *colorData = CGBitmapContextGetData(context);
    unsigned int color = *colorData;
    
    float inputRed = 0.0;
    float inputGreen = 0.0;
    float inputBlue = 0.0;
    // The context stores ARGB bytes; read as a little-endian 32-bit
    // integer, alpha lands in the low byte and blue in the high byte.
    short a = color & 0xFF;
    short r = (color >> 8) & 0xFF;
    short g = (color >> 16) & 0xFF;
    short b = (color >> 24) & 0xFF;
    NSLog(@"CIAreaMaximum output: %d, %d, %d, %d", a, r, g, b);
        
    *colorData = (unsigned int)(r << 8) + ((unsigned int)(g) << 16) + ((unsigned int)(b) << 24) + ((unsigned int)(a));
    //NSLog(@"Second read: %i", colorData);
        
    inputRed = r / 255.0;
    inputGreen = g / 255.0;
    inputBlue = b / 255.0;
    
    CGContextRelease(context);
    
    return [[self dissimilarityKernel] applyWithExtent:[GlobalCIImage sharedSingleton].ciImage.extent roiCallback:^CGRect(int index, CGRect rect) {
        return CGRectMake(0, 0, CGRectGetWidth([GlobalCIImage sharedSingleton].ciImage.extent), CGRectGetHeight([GlobalCIImage sharedSingleton].ciImage.extent));
    } arguments:@[[GlobalCIImage sharedSingleton].ciImage, [NSNumber numberWithFloat:inputRed], [NSNumber numberWithFloat:inputGreen], [NSNumber numberWithFloat:inputBlue]]];
}