Correct crop for CIGaussianBlur

Date: 2012-10-11 12:38:02

Tags: objective-c macos core-image

As I noticed, when CIGaussianBlur is applied to an image, the corners of the image get blurred so that it looks smaller than the original. So I figured out that I need to crop it correctly to avoid having transparent edges on the image. But how do I calculate how much I need to crop depending on the blur amount?


Example:

Original image:

Image with CIGaussianBlur with an inputRadius of 50 (blue is the background behind everything):

9 Answers:

Answer 0 (score: 51):

Take the following code as an example...

CIContext *context = [CIContext contextWithOptions:nil];
CIImage *inputImage = [[CIImage alloc] initWithImage:image];

CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
[filter setValue:inputImage forKey:kCIInputImageKey];
[filter setValue:[NSNumber numberWithFloat:5.0f] forKey:@"inputRadius"];
CIImage *result = [filter valueForKey:kCIOutputImageKey];

// Using the blurred result's own extent yields the enlarged, soft-edged image.
CGImageRef cgImage = [context createCGImage:result fromRect:[result extent]];

This produces the image you provided above. However, if I instead use the original image's rect to create the CGImage from the context, the resulting image is the desired size.

CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];

Answer 1 (score: 24):

There are two issues. The first is that the blur filter samples pixels outside the edges of the input image. Those pixels are transparent, and that is where the transparent edges come from. The trick is to extend the edges before applying the blur filter. This can be done with a clamp filter, e.g. like this:

CIFilter *affineClampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
[affineClampFilter setValue:inputImage forKey:kCIInputImageKey];

// Identity transform: no scaling, just clamp (extend) the edge pixels outward.
CGAffineTransform xform = CGAffineTransformMakeScale(1.0, 1.0);
[affineClampFilter setValue:[NSValue valueWithBytes:&xform
                                           objCType:@encode(CGAffineTransform)]
                     forKey:@"inputTransform"];

This filter extends the edges infinitely and eliminates the transparency. The next step is to apply the blur filter.
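
For completeness, a minimal sketch of that next step, feeding the clamped output into the blur filter (the variable names here are my own; the filter and key names are standard Core Image):

// Run the clamped (edge-extended) image through the Gaussian blur.
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setValue:[affineClampFilter valueForKey:kCIOutputImageKey]
              forKey:kCIInputImageKey];
[blurFilter setValue:@5.0f forKey:@"inputRadius"];
CIImage *outputImage = [blurFilter valueForKey:kCIOutputImageKey];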

The second issue is a bit odd. Some renderers produce a larger output image for the blur filter, and you have to adjust the origin of the resulting CIImage by some offset, e.g. like this:

CGImageRef cgImage = [context createCGImage:outputImage
                                   fromRect:CGRectOffset([inputImage extent],
                                                         offset, offset)];

The software renderer on my iPhone needed three times the blur radius as the offset. The hardware renderer on the same iPhone did not need any offset at all. Maybe you could deduce the offset from the difference in size between the input and output images, but I haven't tried...
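
To make the snippet above self-contained, the offset could be defined from that observation (device-specific, so treat it as a starting point rather than a rule):

// Offset the author observed on the iPhone *software* renderer only;
// the hardware renderer on the same device needed no offset at all.
CGFloat blurRadius = 5.0f;
CGFloat offset = 3.0f * blurRadius;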

Answer 2 (score: 12):

To get a nicely blurred version of an image with hard edges, you first need to apply a CIAffineClamp to the source image to extend its edges outward, and then you need to make sure you use the input image's extent when generating the output image.

The code is as follows:

CIContext *context = [CIContext contextWithOptions:nil];

UIImage *image = [UIImage imageNamed:@"Flower"];
CIImage *inputImage = [[CIImage alloc] initWithImage:image];

CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
[clampFilter setDefaults];
[clampFilter setValue:inputImage forKey:kCIInputImageKey];

CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setValue:clampFilter.outputImage forKey:kCIInputImageKey];
[blurFilter setValue:@10.0f forKey:@"inputRadius"];

CIImage *result = [blurFilter valueForKey:kCIOutputImageKey];

CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];
UIImage *blurredImage = [[UIImage alloc] initWithCGImage:cgImage scale:image.scale orientation:UIImageOrientationUp];

CGImageRelease(cgImage);
  

Note that this code has been tested on iOS. It should be similar for OS X (substituting NSImage for UIImage).
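
For reference, a hedged sketch of what that final wrapping step might look like on OS X (untested here; only the last line changes, and NSZeroSize tells NSImage to use the CGImage's own pixel size):

// Possible OS X equivalent of the last step (untested): wrap the CGImage
// in an NSImage instead of a UIImage.
NSImage *blurredImage = [[NSImage alloc] initWithCGImage:cgImage size:NSZeroSize];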

Answer 3 (score: 5):

This worked for me :)

CIContext *context = [CIContext contextWithOptions:nil];
CIImage *inputImage = [[CIImage alloc] initWithImage:image];
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setDefaults];
[blurFilter setValue:inputImage forKey:@"inputImage"];
CGFloat blurLevel = 20.0f;          // Set blur level
[blurFilter setValue:[NSNumber numberWithFloat:blurLevel] forKey:@"inputRadius"];    // set value for blur level
CIImage *outputImage = [blurFilter valueForKey:@"outputImage"];
CGRect rect = inputImage.extent;    // Create Rect
rect.origin.x += blurLevel;         // and set custom params
rect.origin.y += blurLevel;         // 
rect.size.height -= blurLevel*2.0f; //
rect.size.width -= blurLevel*2.0f;  //
CGImageRef cgImage = [context createCGImage:outputImage fromRect:rect];    // Then apply new rect
imageView.image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);

Answer 4 (score: 2):

I've seen some of these solutions and wanted to recommend a more modern one, based on some of the ideas shared here:

private lazy var coreImageContext = CIContext() // Re-use this.

func blurredImage(image: CIImage, radius: CGFloat) -> CGImage? {
    let blurredImage = image
        .clampedToExtent()
        .applyingFilter(
            "CIGaussianBlur",
            parameters: [
                kCIInputRadiusKey: radius,
            ]
        )
        .cropped(to: image.extent)

    return coreImageContext.createCGImage(blurredImage, from: blurredImage.extent)
}

If you need a UIImage later, you can of course get it like this:

let image = UIImage(cgImage: cgImage)

...and for those who are curious, the reason for returning a CGImage is (as noted in the Apple documentation):

Due to a mismatch between Core Image's coordinate system and UIKit's, this filtering approach may produce unexpected results when displayed in a UIImageView with contentMode. Make sure to back it with a CGImage so that contentMode is handled correctly.

You could return a CIImage if you need one, but in that case, if you're displaying the image, you would probably want to be careful.

Answer 5 (score: 0):

See the two Xamarin (C#) implementations below.

1) Works on iOS 6

public static UIImage Blur(UIImage image)
{   
    using(var blur = new CIGaussianBlur())
    {
        blur.Image = new CIImage(image);
        blur.Radius = 6.5f;

        using(CIImage output = blur.OutputImage)
        using(CIContext context = CIContext.FromOptions(null))
        using(CGImage cgimage = context.CreateCGImage (output, new RectangleF(0, 0, image.Size.Width, image.Size.Height)))
        {
            return UIImage.FromImage(cgimage);
        }
    }
}

2) Implementation for iOS 7

Using the approach shown above no longer works properly on iOS 7 (at least at the moment, with Xamarin 7.0.1), so I decided to add cropping in a different way (the measures may depend on the blur radius).

private static UIImage BlurImage(UIImage image)
{   
    using(var blur = new CIGaussianBlur())
    {
        blur.Image = new CIImage(image);
        blur.Radius = 6.5f;

        using(CIImage output = blur.OutputImage)
        using(CIContext context = CIContext.FromOptions(null))
        using(CGImage cgimage = context.CreateCGImage (output, new RectangleF(0, 0, image.Size.Width, image.Size.Height)))
        {
            return UIImage.FromImage(Crop(CIImage.FromCGImage(cgimage), image.Size.Width, image.Size.Height));
        }
    }
}

private static CIImage Crop(CIImage image, float width, float height)
{
    var crop = new CICrop
    { 
        Image = image,
        Rectangle = new CIVector(10, 10, width - 20, height - 20) 
    };

    return crop.OutputImage;   
}

Answer 6 (score: 0):

Here is the Swift version:

func applyBlurEffect(image: UIImage) -> UIImage {
    let context = CIContext(options: nil)
    let imageToBlur = CIImage(image: image)
    let blurfilter = CIFilter(name: "CIGaussianBlur")
    blurfilter!.setValue(imageToBlur, forKey: "inputImage")
    blurfilter!.setValue(5.0, forKey: "inputRadius")
    let resultImage = blurfilter!.valueForKey("outputImage") as! CIImage
    let cgImage = context.createCGImage(resultImage, fromRect: resultImage.extent)
    let blurredImage = UIImage(CGImage: cgImage)
    return blurredImage

}

Answer 7 (score: 0):

Try this, passing the input's extent as the argument to -createCGImage:fromRect::

-(UIImage *)gaussianBlurImageWithRadius:(CGFloat)radius {
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *input = [CIImage imageWithCGImage:self.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [filter setValue:input forKey:kCIInputImageKey];
    [filter setValue:@(radius) forKey:kCIInputRadiusKey];
    CIImage *output = [filter valueForKey:kCIOutputImageKey];
    CGImageRef imgRef = [context createCGImage:output
                                      fromRect:input.extent];
    UIImage *outImage = [UIImage imageWithCGImage:imgRef
                                            scale:UIScreen.mainScreen.scale
                                      orientation:UIImageOrientationUp];
    CGImageRelease(imgRef);
    return outImage;
}

Answer 8 (score: 0):

Here is the Swift 5 version of blurring the image. Set the clamp filter to its defaults so there is no need to give a transform.

func applyBlurEffect() -> UIImage? {

    let context = CIContext(options: nil)
    let imageToBlur = CIImage(image: self)
    let clampFilter = CIFilter(name: "CIAffineClamp")!
    clampFilter.setDefaults()
    clampFilter.setValue(imageToBlur, forKey: kCIInputImageKey)

    //The CIAffineClamp filter is setting your extent as infinite, which then confounds your context. Try saving off the pre-clamp extent CGRect, and then supplying that to the context initializer.
    let inputImageExtent = imageToBlur!.extent

    guard let currentFilter = CIFilter(name: "CIGaussianBlur") else {
        return nil
    }
    currentFilter.setValue(clampFilter.outputImage, forKey: kCIInputImageKey)
    currentFilter.setValue(10, forKey: "inputRadius")
    guard let output = currentFilter.outputImage, let cgimg = context.createCGImage(output, from: inputImageExtent) else {
        return nil
    }
    return UIImage(cgImage: cgimg)

}