Replicating cv::warpAffine in Swift with Core Image

Date: 2018-03-14 15:11:33

Tags: ios swift opencv core-image

I am trying to fully replicate a face alignment algorithm in Swift with Core Image. However, I am already stuck trying to replicate a simple warpAffine from OpenCV in Swift.

Python code:

print(M) #M is a matrix calculated using some face detection code
print("let transform = CGAffineTransform(a: {0[0][0]}, b: {0[1][0]}, c: {0[0][1]}, d: {0[1][1]}, tx: {0[0][2]}, ty: {0[1][2]})".format(M))
warped = cv2.warpAffine(img,M,(image_size[1],image_size[0]), borderValue = 0.0) #warped is the correctly aligned image, image_size is 112,112

My Swift version:

import UIKit
import CoreImage

let ctx = CIContext()
let image = UIImage(named: "lg.jpg")
let ciimage = CIImage(image: image!)
let transform = CGAffineTransform(a: 0.390825773796, b: -0.0333640808264, c: 0.0333640808264, d: 0.390825773796, tx: -53.8777275485, ty: -23.0859985227)
let aligned = ciimage?.transformed(by: transform)
let size = aligned!.extent
let center = CGPoint(x: size.midX, y: size.midY)
let cropRect = CGRect(x: center.x-56, y: center.y-56, width: 112, height: 112)
let cropped = aligned!.cropped(to: cropRect)
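
To inspect the output, the cropped CIImage can be rendered through the ctx created above; a minimal sketch (this rendering step is assumed and not part of the warp itself):

// Sketch: render the cropped CIImage to a CGImage via the CIContext above,
// so it can be compared against OpenCV's warped output.
if let cgResult = ctx.createCGImage(cropped, from: cropped.extent) {
    let uiResult = UIImage(cgImage: cgResult)
    // uiResult is the 112x112 crop produced by Core Image.
}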

The Swift version (cropped) gives an image that is rotated a few degrees to the left and cropped too far off.

I have already tried reordering the transform parameters, since I guessed that is where I went wrong. But according to the OpenCV docs and the CGAffineTransform docs, the parameters should be correct.
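
For reference, a minimal Swift sketch of the ordering that the Python format string above encodes, using the same six values as the transform (m below is simply the printed 2×3 OpenCV matrix written out as a nested array; it illustrates the ordering, not a different mapping):

import CoreGraphics

// The 2x3 affine matrix M from OpenCV, row-major:
// [[m00, m01, m02],
//  [m10, m11, m12]]
let m: [[CGFloat]] = [
    [ 0.390825773796,   0.0333640808264, -53.8777275485],
    [-0.0333640808264,  0.390825773796,  -23.0859985227]
]

// Same ordering as the Python format string:
// a = M[0][0], b = M[1][0], c = M[0][1], d = M[1][1], tx = M[0][2], ty = M[1][2]
let transformFromM = CGAffineTransform(a: m[0][0], b: m[1][0],
                                       c: m[0][1], d: m[1][1],
                                       tx: m[0][2], ty: m[1][2])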

1 Answer:

Answer 0 (score: 0):

Unfortunately, I was not able to replicate it using Core Image. I did get it working well with UIImage, though:

extension UIImage {
    /// Draws the image into a bitmap context of the given size with the affine
    /// transform applied, roughly mirroring cv2.warpAffine(img, M, dsize).
    func transformed(by transform: CGAffineTransform, size: CGSize) -> UIImage {
        UIGraphicsBeginImageContext(size)
        let c = UIGraphicsGetCurrentContext()
        let orig = CGRect(x: 0, y: 0, width: self.size.width, height: self.size.height)
        // Apply the warp, then draw the full original image through it.
        c?.concatenate(transform)
        self.draw(in: orig)
        let result = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return result!
    }
}
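
A possible call site, as a sketch, using the transform values and the 112×112 output size from the question (the variable names here are placeholders):

let transform = CGAffineTransform(a: 0.390825773796, b: -0.0333640808264,
                                  c: 0.0333640808264, d: 0.390825773796,
                                  tx: -53.8777275485, ty: -23.0859985227)
if let input = UIImage(named: "lg.jpg") {
    // 112x112 matches the image_size passed to cv2.warpAffine in the question.
    let warped = input.transformed(by: transform, size: CGSize(width: 112, height: 112))
    // warped plays the role of the Python `warped`.
}

A plausible reason this route behaves as expected is that UIKit drawing contexts use a top-left origin like OpenCV, whereas Core Image works in a bottom-left-origin coordinate space, which can flip the sense of the rotation.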

While this does work, converting from CIImage to UIImage and back proved too slow for my use case.