OpenCV undistortPoints() not working

Asked: 2017-03-28 22:15:59

Tags: c++ opencv image-processing

My goal is to map a fisheye image onto a grid (as a texture) and then deform that grid so that it transforms the image into a rectilinear one.

Using OpenCV, I computed the fisheye calibration values for my camera. Now I am stuck on using those calibration values to transform the grid.

It seems that cv::fisheye::undistortPoints() is designed to do exactly what I describe, but I cannot get it to work. My code (included below) produces results that are clearly wrong. For testing, I use 0.0 for all distortion coefficients, which I expected to mean that straight lines map to straight lines, yet the points [0, 0], [1, 0] and [2, 0] do not map onto a line; instead they become [-4.4789, -4.4789], [0, -1.55741] and [4.4789, -4.4789].

Have I made a mistake somewhere, or is my understanding of undistortPoints() incorrect?

Full output:

Camera Matrix: [3 x 3] (CV_64FC1)
[1, 0, 1;
 0, 1, 1;
 0, 0, 1]

Distortion Coefficients: [4 x 1] (CV_64FC1)
[0, 0, 0, 0]

Transformation done by fisheye::undistortPoints():
[0, 0] -> [-4.4789, -4.4789]
[1, 0] -> [0, -1.55741]
[2, 0] -> [4.4789, -4.4789]
[0, 1] -> [-1.55741, 0]
[1, 1] -> [0, 0]
[2, 1] -> [1.55741, 0]
[0, 2] -> [-4.4789, 4.4789]
[1, 2] -> [0, 1.55741]
[2, 2] -> [4.4789, 4.4789]

Main.cpp

#include <iostream>
#include <vector>
#include <opencv2/core.hpp>
#include <opencv2/calib3d.hpp>

#define WIDTH 2.0
#define HEIGHT 2.0
#define CENTER_X 1.0
#define CENTER_Y 1.0
#define FOCAL_X 1.0
#define FOCAL_Y 1.0
#define K1 0.0
#define K2 0.0
#define K3 0.0
#define K4 0.0

using namespace cv;
using namespace std;

// Simple print helpers (assumed; their definitions were omitted from the post).
static void printMatrix(const string& name, const Mat& m) {
    cout << name << ": [" << m.cols << " x " << m.rows << "] (CV_64FC1)" << endl
         << m << endl << endl;
}

static void printTransformation(const string& name,
                                const vector<Point2d>& input,
                                const vector<Point2d>& output) {
    cout << name << ":" << endl;
    for (size_t i = 0; i < input.size(); i++) {
        cout << "[" << input[i].x << ", " << input[i].y << "] -> ["
             << output[i].x << ", " << output[i].y << "]" << endl;
    }
}

struct PointsCalibration {

    PointsCalibration(int width, int height, double centerX, double centerY,
                      double focalX, double focalY,
                      double k1, double k2, double k3, double k4) {
        vector<Point2d> inputDistortedPoints;
        vector<Point2d> outputUndistortedPoints;

        Size size(width, height);

        Mat cameraMatrix = (Mat1d(3, 3) <<
                focalX, 0.0,    centerX,
                0.0,    focalY, centerY,
                0.0,    0.0,    1.0
        );

        Mat distCoeffs = (Mat1d(1, 4) << k1, k2, k3, k4);

        printMatrix("Camera Matrix", cameraMatrix);
        printMatrix("Distortion Coefficients", distCoeffs);

        // Build a (width + 1) x (height + 1) grid of pixel coordinates.
        for (int col = 0; col <= size.width; col++) {
            for (int row = 0; row <= size.height; row++) {
                inputDistortedPoints.push_back(Point2d(row, col));
            }
        }

        fisheye::undistortPoints(inputDistortedPoints, outputUndistortedPoints,
                                 cameraMatrix, distCoeffs);

        printTransformation("Transformation done by fisheye::undistortPoints()",
                            inputDistortedPoints, outputUndistortedPoints);
    }
};

int main() {
    // Create an instance based on the hard-coded calibration values, for testing.
    PointsCalibration calib(WIDTH, HEIGHT, CENTER_X, CENTER_Y,
                            FOCAL_X, FOCAL_Y, K1, K2, K3, K4);
}

1 Answer:

Answer 0 (score: 0)

The cause of my problem was that I was trying to undistort points that lie outside the range covered by the fisheye distortion (the black areas of the image).

One way to fix this is to start with points at their undistorted positions and use the distortPoints() function to find their distorted positions. Then use those distorted points as the starting points.

Another fix is to identify the black regions of the fisheye image and avoid placing any points outside of them.