I am building an app with a UIWebView and a button on a single view controller. When the button is tapped, an image of the UIWebView is captured using a UIGraphicsContext.
That part works great! After the image is captured, it is shown as a subview on the same view, and I have been using the ImageCropper library to draw a CGRect over the UIImageView in another subview that has a Submit button. The rectangle itself can be resized (by dragging its corners/edges) and moved around the view.
When the Submit button is tapped, another subview appears in the top-left corner of the screen showing the cropped image. The idea is to capture only what is inside the rectangle. I can get the code to run, but the "cropped" image is the same image, not just the portion inside the CGRect.
I have 3 screenshots showing how it works and the incorrectly cropped result: Shot 1, Shot 2, Shot 3. I believe the problem is that the size of the captured image and the size of the crop rectangle are not in the same coordinate space, which is why the result is distorted.
Does anyone know what the cause might be? Sorry for the lengthy question, but any help would be greatly appreciated!
Here is my code:
ViewController.swift:
class ViewController: UIViewController {

    @IBOutlet var webView: UIWebView!
    @IBOutlet var imageView: UIImageView!

    override func viewDidLoad() {
        imageView.isHidden = true
        let aString = URL(string: "https://www.kshuntfishcamp.com/home.page")
        webView.loadRequest(URLRequest(url: aString!))
        super.viewDidLoad()
    }

    @IBAction func takePhotoPressed(_ sender: UIButton) {
        UIGraphicsBeginImageContextWithOptions(webView.bounds.size, false, 0.0)
        if let aContext = UIGraphicsGetCurrentContext() {
            webView.layer.render(in: aContext)
        }
        let capturedImage: UIImage? = UIGraphicsGetImageFromCurrentImageContext()
        // Balance the Begin call so the context is not leaked
        UIGraphicsEndImageContext()

        imageView = UIImageView(frame: CGRect(x: 22, y: 123, width: 330, height: 330))
        let image = capturedImage
        imageView.image = image
        imageView.contentMode = UIViewContentMode.scaleAspectFill
        imageView.clipsToBounds = true
        imageView.isHidden = true
        webView.isHidden = true

        let editView = EditImageView(frame: self.view.frame)
        let image2 = capturedImage!
        editView.initWithImage(image: image2)
        let croppedImage = editView.getCroppedImage()
        self.view.addSubview(editView)
        self.view.backgroundColor = UIColor.clear
        UIImageWriteToSavedPhotosAlbum(croppedImage, nil, nil, nil)
    }
}
EditImageView.swift (source: https://github.com/Thanatos-L/LyEditImageView) - only the parts that seem relevant to the problem are included:
func initWithImage(image: UIImage) {
    imageView = UIImageView(frame: CGRect(x: 22, y: 123, width: 330, height: 330))
    imageView.tag = IMAGE_VIEW_TAG
    self.addSubview(self.imageView)
    imageView.isUserInteractionEnabled = true
    imageView.image = image
    imageView.frame = CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height)
    let frame = AVMakeRect(aspectRatio: imageView.frame.size, insideRect: self.frame)
    imageView.frame = frame
    originImageViewFrame = frame
    NSLog("initWithImage %@", NSStringFromCGRect(originImageViewFrame))
    imageZoomScale = 1.0
    commitInit()
}
private func cropImage() {
    let rect = self.convert(cropView.frame, to: imageView)
    let imageSize = imageView.image?.size
    let ratio = originImageViewFrame.size.width / (imageSize?.width)!
    let zoomedRect = CGRect(x: rect.origin.x / ratio,
                            y: rect.origin.y / ratio,
                            width: rect.size.width / ratio,
                            height: rect.size.height / ratio)
    let croppedImage = cropImage(image: imageView.image!, toRect: zoomedRect)
    var view: UIImageView? = self.viewWithTag(1301) as? UIImageView
    if view == nil {
        view = UIImageView()
    }
    view?.frame = CGRect(x: 0, y: 0, width: croppedImage.size.width, height: croppedImage.size.height)
    view?.image = croppedImage
    view?.tag = 1301
    self.addSubview(view!)
}
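For what it's worth, one plausible source of the mismatch is scale: passing 0.0 to UIGraphicsBeginImageContextWithOptions captures at the screen's scale, so the snapshot's backing CGImage is measured in pixels (e.g. 2x or 3x) while the crop rect is measured in view points. A minimal sketch of a scale-aware crop follows; the helper name `cropInViewCoordinates` is my own, not part of LyEditImageView, and this assumes the image is displayed aspect-fit with no zoom applied:

    import UIKit

    // Crop `image` to `rect`, where `rect` is expressed in the coordinate
    // space of `imageView` (points). CGImage.cropping(to:) works in pixels,
    // so each coordinate is converted: view points -> image points (via the
    // displayed-size ratio) -> pixels (via the image's scale factor).
    func cropInViewCoordinates(_ image: UIImage,
                               to rect: CGRect,
                               displayedIn imageView: UIImageView) -> UIImage? {
        // Ratio between the image's size in points and the displayed size
        let ratio = image.size.width / imageView.frame.width
        let pixelRect = CGRect(x: rect.origin.x * ratio * image.scale,
                               y: rect.origin.y * ratio * image.scale,
                               width: rect.size.width * ratio * image.scale,
                               height: rect.size.height * ratio * image.scale)
        guard let cgImage = image.cgImage?.cropping(to: pixelRect) else { return nil }
        // Preserve scale and orientation so the crop renders at the same
        // point size as the corresponding region of the original
        return UIImage(cgImage: cgImage,
                       scale: image.scale,
                       orientation: image.imageOrientation)
    }

The key difference from the cropImage() above is the extra multiplication by image.scale; without it, on a 2x device the crop rect covers only a quarter of the intended area, which would look like "the same image, not the portion inside the CGRect."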