iOS OpenCV warpPerspective wrong results

Asked: 2019-06-20 14:21:34

Tags: objective-c swift opencv

I am trying to implement cropping and perspective correction. I am building the app on iOS (Swift) and using OpenCV for the cropping and the perspective transform.

I have already run edge detection to find the contour points for me, so I basically have four points forming a rectangle.

However, when I send the points to OpenCV, it returns the distorted images shown below.

Here are the results I get.

Here is the original image: [image]

In this case, the selected points are at the extremes:

[image: distorted result]

In this case, I took the picture from above:

[image: distorted result]

I followed the article below, but had no luck:

Perspective Transform + Crop in iOS with OpenCV

I am doing something wrong with the image orientation and the ordering of the points.

I am using OpenCV 4 with Swift, trying to deskew, align, and possibly crop an image taken with the device camera.

I let the user drag four points on the screen onto the corners of the image. Once the user has selected the points, I send them to OpenCV to try to deskew the image. I am currently using cv::warpPerspective to achieve this.

I have searched and tried several samples, but I cannot seem to find what is wrong with my approach.

+(UIImage*)confirmedImage
:(UIImage *)sourceImage
:(CGFloat)contentScale
:(CGPoint)point1
:(CGPoint)point2
:(CGPoint)point3
:(CGPoint)point4
{

    vector<Point2f> pointsToSort;
    pointsToSort.push_back(Point2f(point1.x, point1.y));
    pointsToSort.push_back(Point2f(point2.x, point2.y));
    pointsToSort.push_back(Point2f(point3.x, point3.y));
    pointsToSort.push_back(Point2f(point4.x, point4.y));

    vector<Point2f> sortedPoints = [self sortCorners:pointsToSort];
    CGPoint p1 = CGPointMake(CGFloat(sortedPoints[0].x), CGFloat(sortedPoints[0].y));
    CGPoint p2 = CGPointMake(CGFloat(sortedPoints[1].x), CGFloat(sortedPoints[1].y));
    CGPoint p3 = CGPointMake(CGFloat(sortedPoints[2].x), CGFloat(sortedPoints[2].y));
    CGPoint p4 = CGPointMake(CGFloat(sortedPoints[3].x), CGFloat(sortedPoints[3].y));

    NSLog(@"\npoints after sorting:");
    NSLog(@"point1 (%f, %f)", p1.x, p1.y);
    NSLog(@"point2 (%f, %f)", p2.x, p2.y);
    NSLog(@"point3 (%f, %f)", p3.x, p3.y);
    NSLog(@"point4 (%f, %f)", p4.x, p4.y);

    CGFloat scaleFactor = contentScale;

    NSLog(@"\n scaleFactor: %f", scaleFactor);

    CGPoint ptBottomLeft = CGPointMake(CGFloat(p1.x / scaleFactor), CGFloat(p1.y / scaleFactor));
    CGPoint ptBottomRight =  CGPointMake(CGFloat(p2.x / scaleFactor), CGFloat(p2.y / scaleFactor));
    CGPoint ptTopRight =  CGPointMake(CGFloat(p3.x / scaleFactor), CGFloat(p3.y / scaleFactor));
    CGPoint ptTopLeft =  CGPointMake(CGFloat(p4.x / scaleFactor), CGFloat(p4.y / scaleFactor));

    // Euclidean lengths of the bottom/top and right/left edges.
    CGFloat w1 = sqrt(pow(ptBottomRight.x - ptBottomLeft.x, 2) + pow(ptBottomRight.y - ptBottomLeft.y, 2));
    CGFloat w2 = sqrt(pow(ptTopRight.x - ptTopLeft.x, 2) + pow(ptTopRight.y - ptTopLeft.y, 2));

    CGFloat h1 = sqrt(pow(ptTopRight.x - ptBottomRight.x, 2) + pow(ptTopRight.y - ptBottomRight.y, 2));
    CGFloat h2 = sqrt(pow(ptTopLeft.x - ptBottomLeft.x, 2) + pow(ptTopLeft.y - ptBottomLeft.y, 2));

    NSLog(@"\n w1 : %f", w1);
    NSLog(@"\n w2 : %f", w2);
    NSLog(@"\n h1 : %f", h1);
    NSLog(@"\n h2 : %f", h2);

    // Take the larger edge in each direction so no content is squeezed.
    CGFloat maxWidth = (w1 > w2) ? w1 : w2;
    CGFloat maxHeight = (h1 > h2) ? h1 : h2;

    NSLog(@"\n maxWidth : %f", maxWidth);
    NSLog(@"\n maxHeight : %f", maxHeight);
    cv::Point2f src[4], dst[4];
    src[0].x = ptTopLeft.x;
    src[0].y = ptTopLeft.y;
    src[1].x = ptTopRight.x;
    src[1].y = ptTopRight.y;
    src[2].x = ptBottomLeft.x;
    src[2].y = ptBottomLeft.y;
    src[3].x = ptBottomRight.x;
    src[3].y = ptBottomRight.y;

    dst[0].x = 0;
    dst[0].y = 0;
    dst[1].x = maxWidth - 1;
    dst[1].y = 0;
    dst[2].x = 0;
    dst[2].y = maxHeight - 1;
    dst[3].x = maxWidth - 1;
    dst[3].y = maxHeight - 1;

    // src and dst must list the corners in the same order
    // (here: top-left, top-right, bottom-left, bottom-right).

    cv::Size matSize = cv::Size(maxWidth, maxHeight);

    cv::Mat undistorted;  // warpPerspective allocates this to match the input type
    cv::Mat original = [OpenCVWrapperInternal cvMatFromUIImage:sourceImage];
    cv::warpPerspective(original, undistorted, cv::getPerspectiveTransform(src, dst), matSize);
    original.release();

    // Rotate 90 degrees counter-clockwise; note that this result is currently
    // unused, since the returned image is built from `undistorted`, not `rotated`.
    cv::Mat rotated;
    cv::rotate(undistorted, rotated, cv::ROTATE_90_COUNTERCLOCKWISE);

    UIImage *newImage = [OpenCVWrapperInternal UIImageFromCVMat:undistorted :sourceImage];

    undistorted.release();
    rotated.release();

    return newImage;  
}

I would like the final image to come out deskewed and rendered correctly, regardless of the orientation or keystone distortion at the time the picture was taken.

0 Answers:

No answers yet