How to efficiently determine the most frequently used color?

Asked: 2014-12-15 02:39:50

Tags: objective-c macos cocoa

I need to determine the dominant NSColor (the one occurring most often in the current bitmap palette) of the currently displayed screen... I've built something that works, but it is terribly slow... I need to run it roughly once per second (it currently takes over 6 seconds to process), and I'd like it not to hog the CPU (which is currently the case).

The part that kills it is the pair of nested loops (width x height) that analyze every pixel. Is there a more efficient way to do this? I'm sure there is... Any examples?

Thanks!

#include "ScreenCapture.h"
#import <AVFoundation/AVFoundation.h>

@implementation ScreenCapture

@synthesize captureSession;
@synthesize stillImageOutput;
@synthesize stillImage;

//-----------------------------------------------------------------------------------------------------------------
- (id) init
{
    if ((self = [super init]) == nil)
        return nil;

    [self setCaptureSession:[[AVCaptureSession alloc] init]];

    // main screen input

    CGDirectDisplayID displayId = kCGDirectMainDisplay;
    AVCaptureScreenInput *input = [[AVCaptureScreenInput alloc] initWithDisplayID:displayId];
    [input setMinFrameDuration:CMTimeMake(1, 1)];
    input.capturesCursor = NO;
    input.capturesMouseClicks = NO;

    if ([[self captureSession] canAddInput:input])
        [[self captureSession] addInput:input];

    // still image output

    [self setStillImageOutput:[[AVCaptureStillImageOutput alloc] init]];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG,AVVideoCodecKey,nil];
    [[self stillImageOutput] setOutputSettings:outputSettings];

    if ([[self captureSession] canAddOutput:[self stillImageOutput]])
         [[self captureSession] addOutput:[self stillImageOutput]];

    // start capturing

    [[self captureSession] startRunning];

    return self;
}

//-----------------------------------------------------------------------------------------------------------------
- (NSColor *) currentlyDominantColor
{
    [self captureImage];

    if ([self stillImage] != nil)
    {
        NSBitmapImageRep* imageRep = [[NSBitmapImageRep alloc] initWithCIImage:[self stillImage]];

        NSInteger pixelsWide = [imageRep pixelsWide];
        NSInteger pixelsHigh = [imageRep pixelsHigh];
        NSCountedSet* imageColors = [[NSCountedSet alloc] initWithCapacity:pixelsWide * pixelsHigh];

        NSColor* dominantColor = nil;
        NSUInteger highCount = 0;
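
        // hot spot: one colorAtX:y: message (and NSColor allocation) per pixel,
        // plus a counted-set lookup; this is what makes the method so slow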

        for (NSUInteger x = 0; x < pixelsWide; x++)
        {
            for (NSUInteger y = 0; y < pixelsHigh; y++)
            {
                NSColor* color = [imageRep colorAtX:x y:y];
                [imageColors addObject:color];

                NSUInteger count = [imageColors countForObject:color];
                if (count > highCount)
                {
                    dominantColor = color;
                    highCount = count;
                }
            }
        }

        return dominantColor;
    }
    else
    {
        // dummy random color until an actual color gets computed

        double r1 = ((double) arc4random() / 0x100000000);
        double r2 = ((double) arc4random() / 0x100000000);
        double r3 = ((double) arc4random() / 0x100000000);

        return [NSColor colorWithCalibratedRed:r1 green:r2 blue:r3 alpha:1.0f];
    }
}

//-----------------------------------------------------------------------------------------------------------------
- (void) captureImage
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections])
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection)
            break;
    }

    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                         completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        // skip the frame on failure; the previous stillImage stays in place
        if (error != nil || imageSampleBuffer == NULL)
            return;

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        CIImage *image = [[CIImage alloc] initWithData:imageData];
        [self setStillImage:image];
    }];
}

//-----------------------------------------------------------------------------------------------------------------
- (void) dealloc
{
    [[self captureSession] stopRunning];

    captureSession = nil;
    stillImageOutput = nil;
    stillImage = nil;
}

@end

3 Answers:

Answer 0 (score: 2)

Here's an outline of a much faster algorithm. Most of the slowness in your code comes from all the calls to colorAtX:y: (fetching the pixel, creating an NSColor, and so on; profile your app to confirm), all of which go through message dispatch. You can do much better by accessing the bitmap data directly.

For example, let's assume your bitmap is meshed, i.e. interleaved rather than planar (check isPlanar), and has 32-bit pixels (bitsPerPixel); you can adjust for other formats.

  1. Verify the conditions above.
  2. Get a pointer to the pixels (bitmapData). This is effectively a C array of uint32 pixels whose length is the number of pixels (totalBytes / 4).
  3. Sort the pixels (e.g. with qsort), which gives you runs of identical pixel values. Yes, it mangles the image, but who cares: you created it just for this purpose.
  4. Loop over the array and find the longest run of the same pixel value. You are just looking for runs of identical uint32 values, which is a simple algorithm.
  5. After the loop, create the color with colorWithColorSpace:components:count:, taking the color space from the bitmap (colorSpace) and the float components by extracting each byte from the pixel (shift & mask, then convert to a float in the 0-1 range). See the sketch after this list.

HTH
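A minimal sketch of that outline, assuming a meshed 32-bit RGBA bitmap with no row padding on a little-endian host (so red is the low byte of each uint32); the helper name DominantColorOfBitmap and those byte-order assumptions are illustrative, not part of the answer:

#import <Cocoa/Cocoa.h>
#include <stdlib.h>

static int ComparePixels(const void *a, const void *b)
{
    uint32_t pa = *(const uint32_t *)a;
    uint32_t pb = *(const uint32_t *)b;
    return (pa > pb) - (pa < pb);
}

static NSColor *DominantColorOfBitmap(NSBitmapImageRep *imageRep)
{
    NSInteger pixelsWide = [imageRep pixelsWide];
    NSInteger pixelsHigh = [imageRep pixelsHigh];

    // only handle the simple case: meshed 32-bit pixels, no row padding
    if ([imageRep isPlanar] || [imageRep bitsPerPixel] != 32 ||
        [imageRep bytesPerRow] != pixelsWide * 4 || pixelsWide * pixelsHigh == 0)
        return nil;

    uint32_t *pixels = (uint32_t *)[imageRep bitmapData];
    NSUInteger pixelCount = (NSUInteger)(pixelsWide * pixelsHigh);

    // sorting mangles the image, but it exists only for this purpose
    qsort(pixels, pixelCount, sizeof(uint32_t), ComparePixels);

    // one pass over the sorted array: the longest run wins
    uint32_t bestPixel = pixels[0];
    NSUInteger bestRun = 1;
    NSUInteger run = 1;
    for (NSUInteger i = 1; i < pixelCount; i++)
    {
        run = (pixels[i] == pixels[i - 1]) ? run + 1 : 1;
        if (run > bestRun)
        {
            bestRun = run;
            bestPixel = pixels[i];
        }
    }

    // unpack the R, G, B, A bytes into 0..1 components
    CGFloat components[4];
    for (int i = 0; i < 4; i++)
        components[i] = ((bestPixel >> (8 * i)) & 0xFF) / 255.0;

    return [NSColor colorWithColorSpace:[imageRep colorSpace]
                             components:components
                                  count:4];
}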

Answer 1 (score: 0)

Consider CIFilter's CIAreaAverage. It knows more about fast math than we mere mortals do!
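For illustration, a hedged sketch of reading the filter's result (CIAreaAverage reduces the input extent to a 1x1 image that can be rendered into a 4-byte buffer; the helper name AverageColorRGBA8 is made up):

#import <CoreImage/CoreImage.h>

// renders the CIAreaAverage of `image` into a 4-byte RGBA sample
static void AverageColorRGBA8(CIImage *image, uint8_t rgba[4])
{
    CGRect r = [image extent];
    CIVector *extent = [CIVector vectorWithX:r.origin.x
                                           Y:r.origin.y
                                           Z:r.size.width
                                           W:r.size.height];
    CIFilter *filter = [CIFilter filterWithName:@"CIAreaAverage"
                                  keysAndValues:kCIInputImageKey, image,
                                                kCIInputExtentKey, extent, nil];
    CIImage *average = [filter outputImage];   // a 1x1 image

    CIContext *context = [CIContext contextWithOptions:nil];
    [context render:average
           toBitmap:rgba
           rowBytes:4
             bounds:CGRectMake(0, 0, 1, 1)
             format:kCIFormatRGBA8
         colorSpace:nil];
}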

Answer 2 (score: 0)

This code doesn't do exactly what you want, but if you don't grab the pixel values this way, the values you get will be inaccurate. I don't know why.

Anyway, this is an answer to a series of other questions: getting image metrics, specifically the minimum, average and maximum. Note how I obtained the pixel values; you need to do it that way. The only change you would make to the code is to add a loop that iterates over every pixel based on the height and width (a basic for loop, sketched after the code below, is all you need).

Here's my output...

2015-07-17 14:58:03.751 Chroma Photo Editing Extension[1945:155358] CIAreaMinimum output: 255, 27, 0, 0

2015-07-17 15:00:08.086 Chroma Photo Editing Extension[2156:157963] CIAreaAverage output: 255, 191, 166, 155

2015-07-17 15:01:24.047 Chroma Photo Editing Extension[2253:159246] CIAreaMaximum output: 255, 255, 255, 238

...from the following code (written for iOS):

- (CIImage *)outputImage
{
    [GlobalCIImage sharedSingleton].ciImage = self.inputImage;
    
    CGRect inputExtent = [[GlobalCIImage sharedSingleton].ciImage extent];
    CIVector *extent = [CIVector vectorWithX:inputExtent.origin.x
                                           Y:inputExtent.origin.y
                                           Z:inputExtent.size.width
                                           W:inputExtent.size.height];
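    // note: despite its name, inputAverage below holds the output of
    // CIAreaMaximum (swap in CIAreaMinimum or CIAreaAverage for the others)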
    CIImage *inputAverage = [CIFilter filterWithName:@"CIAreaMaximum" keysAndValues:kCIInputImageKey, [GlobalCIImage sharedSingleton].ciImage, kCIInputExtentKey, extent, nil].outputImage;
    size_t rowBytes = 4;
    uint8_t byteBuffer[rowBytes];
    
    [[GlobalContext sharedSingleton].ciContext render:inputAverage toBitmap:byteBuffer rowBytes:rowBytes bounds:[inputAverage extent] format:kCIFormatRGBA8 colorSpace:nil];
    
    int width = inputAverage.extent.size.width;
    int height = inputAverage.extent.size.height;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, width * 4, colorSpace, kCGBitmapAlphaInfoMask & kCGImageAlphaPremultipliedFirst);
    
    CGColorSpaceRelease(colorSpace);
    
    CGImageRef cgImage = [[GlobalContext sharedSingleton].ciContext createCGImage:inputAverage fromRect:CGRectMake(0, 0, width, height)];
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
    CGImageRelease(cgImage);   // createCGImage:fromRect: returns a +1 reference
    
    unsigned int *colorData = CGBitmapContextGetData(context);
    unsigned int color = *colorData;
    
    float inputRed = 0.0;
    float inputGreen = 0.0;
    float inputBlue = 0.0;
    short a = color & 0xFF;
    short r = (color >> 8) & 0xFF;
    short g = (color >> 16) & 0xFF;
    short b = (color >> 24) & 0xFF;
    NSLog(@"CIAreaMaximum output: %d, %d, %d, %d", a, r, g, b);
        
    *colorData = (unsigned int)(r << 8) + ((unsigned int)(g) << 16) + ((unsigned int)(b) << 24) + ((unsigned int)(a));
    //NSLog(@"Second read: %i", colorData);
        
    inputRed = r / 255.0;
    inputGreen = g / 255.0;
    inputBlue = b / 255.0;
    
    CGContextRelease(context);
    
    return [[self dissimilarityKernel] applyWithExtent:[GlobalCIImage sharedSingleton].ciImage.extent roiCallback:^CGRect(int index, CGRect rect) {
        return CGRectMake(0, 0, CGRectGetWidth([GlobalCIImage sharedSingleton].ciImage.extent), CGRectGetHeight([GlobalCIImage sharedSingleton].ciImage.extent));
    } arguments:@[[GlobalCIImage sharedSingleton].ciImage, [NSNumber numberWithFloat:inputRed], [NSNumber numberWithFloat:inputGreen], [NSNumber numberWithFloat:inputBlue]]];
}
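
As for the extra loop the answer mentions, a hedged sketch of walking every pixel of a CGBitmapContext backing store created as above (same ARGB byte layout; width, height and the accumulation step are placeholders):

unsigned int *pixels = CGBitmapContextGetData(context);
for (int y = 0; y < height; y++)
{
    for (int x = 0; x < width; x++)
    {
        unsigned int color = pixels[y * width + x];
        short a = color & 0xFF;          // same shift & mask as above
        short r = (color >> 8) & 0xFF;
        short g = (color >> 16) & 0xFF;
        short b = (color >> 24) & 0xFF;
        // ...histogram or accumulate r, g, b here...
    }
}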