I have a camera-like app that captures photos and applies filters to them. Everything works fine, except that when I apply a filter to an image, the image's dimensions change. I have also tried resizing the image, but I still cannot get it back to the same size as the original. My code is:
let ciContext = CIContext(options: nil)
let coreimg = CIImage(image: AppDelegate.captured_iamge!)
let filter = CIFilter(name: CIFilterNames[indexPath.row])
filter!.setDefaults()
filter!.setValue(coreimg, forKey: kCIInputImageKey)
let filtered_img_data = filter!.value(forKey: kCIOutputImageKey) as! CIImage
let filtered_img_ref = ciContext.createCGImage(filtered_img_data, from: filtered_img_data.extent)
let img = UIImage(cgImage: filtered_img_ref!)
print("Filtered Image size is \(img.size)")
let size = img_view.image?.size
print("ImageView on which image is displaying size \(size!)")
let newimg = resizeImage(image: img, targetSize: size!)
print("Resized Image size \(newimg.size)")
img_view.image = newimg
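From what I have read, the filtered CIImage's extent does not have to match the input extent (some filters pad the image), and UIImage(cgImage:) also drops the original scale and orientation (it defaults to scale 1.0 and .up). Below is a variant I have been experimenting with that crops the output back to the input extent and keeps the captured image's scale and orientation. The helper name applyFilterPreservingSize is just something I made up, and I am not sure this is the correct fix:

import UIKit
import CoreImage

// Sketch only: crop the filter output back to the input's extent and keep
// the original UIImage's scale/orientation when converting back.
// `applyFilterPreservingSize` is my own name for this experiment.
func applyFilterPreservingSize(_ source: UIImage, filterName: String) -> UIImage? {
    guard let input = CIImage(image: source),
          let filter = CIFilter(name: filterName) else { return nil }

    filter.setDefaults()
    filter.setValue(input, forKey: kCIInputImageKey)

    guard let output = filter.outputImage else { return nil }

    // Some filters (e.g. blurs) grow the extent, so crop back to the input's extent.
    let cropped = output.cropped(to: input.extent)

    let context = CIContext(options: nil)
    guard let cgImage = context.createCGImage(cropped, from: cropped.extent) else { return nil }

    // Keep the captured image's scale and orientation instead of the defaults.
    return UIImage(cgImage: cgImage, scale: source.scale, orientation: source.imageOrientation)
}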
The method that resizes the image:
func resizeImage(image: UIImage, targetSize: CGSize) -> UIImage {
    let newSize: CGSize = targetSize
    let rect = CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height)
    UIGraphicsBeginImageContextWithOptions(newSize, false, 1.0)
    image.draw(in: rect)
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage!
}
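One thing I noticed while debugging: UIImage.size is in points, and I pass 1.0 as the scale to UIGraphicsBeginImageContextWithOptions, while the captured photo has its own scale, so the pixel dimensions can still differ even when the point sizes look the same. A scale-aware variant I tried (again, just a sketch, I am not sure this is what is missing):

import UIKit

// Sketch: same resize, but keep the source image's scale so both the point
// size and the pixel size line up with the original image.
func resizeImageKeepingScale(image: UIImage, targetSize: CGSize) -> UIImage {
    let rect = CGRect(origin: .zero, size: targetSize)
    // Passing image.scale instead of 1.0 so a Retina photo stays Retina.
    UIGraphicsBeginImageContextWithOptions(targetSize, false, image.scale)
    image.draw(in: rect)
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage ?? image
}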
The captured image looks like this:
When I run the code above, the result looks like this:
But when I display the image, it still looks like this:
Any suggestions for solving this problem?