I am trying to distort an image in OpenCV. Undistorting images taken with a pinhole camera already works fine thanks to cv::undistort(raw, undist, cameraMatrix, distCoeffs);
Now I am trying to distort undist back to its original state using the patch I found here: http://code.opencv.org/issues/1387
but so far I have not managed to get it working. Here is the code:
void distort(const cv::Mat& src, cv::Mat& dst, const cv::Mat& cameraMatrix, const cv::Mat& distCoeffs)
{
  // Map that holds, for every pixel, its own (x, y) coordinate.
  cv::Mat pixel_locations_src = cv::Mat(src.size(), CV_32FC2);
  for (int i = 0; i < src.size().height; i++) {
    for (int j = 0; j < src.size().width; j++) {
      pixel_locations_src.at<cv::Point2f>(i,j) = cv::Point2f(j,i);
    }
  }

  // undistortPoints returns normalized (fractional) coordinates,
  // which are scaled back to pixel coordinates below.
  cv::Mat fractional_locations_dst = cv::Mat(src.size(), CV_32FC2);
  cv::undistortPoints(pixel_locations_src, fractional_locations_dst, cameraMatrix, distCoeffs);

  cv::Mat pixel_locations_dst = cv::Mat(src.size(), CV_32FC2);
  const float fx = cameraMatrix.at<double>(0,0);
  const float fy = cameraMatrix.at<double>(1,1);
  const float cx = cameraMatrix.at<double>(0,2);
  const float cy = cameraMatrix.at<double>(1,2);

  // is there a faster way to do this?
  for (int i = 0; i < fractional_locations_dst.size().height; i++) {
    for (int j = 0; j < fractional_locations_dst.size().width; j++) {
      const float x = fractional_locations_dst.at<cv::Point2f>(i,j).x*fx + cx;
      const float y = fractional_locations_dst.at<cv::Point2f>(i,j).y*fy + cy;
      pixel_locations_dst.at<cv::Point2f>(i,j) = cv::Point2f(x,y);
    }
  }

  cv::remap(src, dst, pixel_locations_dst, cv::Mat(), CV_INTER_LINEAR);
}
I tried passing an RGB image to the function, but since undistortPoints expects a 1xN two-channel matrix, the code triggers an assertion inside undistortPoints. I do not understand why undistortPoints takes a 1xN matrix as input.
Any enlightenment on this topic would be great. Thanks.
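For reference, the assertion is about the layout of the point matrix, not the image type: undistortPoints expects the points as a 1xN (or Nx1) two-channel array rather than an HxW map. A minimal, untested sketch of a workaround, reusing pixel_locations_src from the code above (points_1xN and undistorted_1xN are only illustrative names):

// Reshape the HxW coordinate map into the 1xN layout undistortPoints expects.
cv::Mat points_1xN = pixel_locations_src.reshape(2, 1);   // 1 x (W*H), 2 channels
cv::Mat undistorted_1xN;
cv::undistortPoints(points_1xN, undistorted_1xN, cameraMatrix, distCoeffs);
// Reshape the normalized coordinates back into an H x W map.
cv::Mat fractional_locations_dst = undistorted_1xN.reshape(2, src.size().height);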
Answer 0 (score: 0)
I finally went with a different approach, since I only needed to distort a given set of points. Here is the code:
void DistortPoints(const std::vector<cv::Point2f>& src, std::vector<cv::Point2f>& dst, const cv::Mat& cameraMatrix, const cv::Mat& distorsionMatrix)
{
  double fx = cameraMatrix.at<double>(0,0);
  double fy = cameraMatrix.at<double>(1,1);
  double cx = cameraMatrix.at<double>(0,2);
  double cy = cameraMatrix.at<double>(1,2);

  // Back-project the pixels to normalized camera coordinates
  // (projectPoints treats z == 0 like z == 1, so the points stay on the image plane).
  std::vector<cv::Point3f> src2;
  for (size_t i = 0; i < src.size(); i++)
    src2.push_back(cv::Point3f((src[i].x - cx) / fx, (src[i].y - cy) / fy, 0));

  cv::Mat rVec(3, 1, cv::DataType<double>::type, cv::Scalar(0)); // Rotation vector
  cv::Mat tVec(3, 1, cv::DataType<double>::type, cv::Scalar(0)); // Translation vector

  // Project the points back through the camera matrix, this time applying the distortion.
  cv::projectPoints(src2, rVec, tVec, cameraMatrix, distorsionMatrix, dst);
}
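A quick usage sketch: the camera matrix and distortion coefficients below are placeholder values for illustration only, not taken from the original post.

// Hypothetical calibration values, purely for illustration.
cv::Mat cameraMatrix = (cv::Mat_<double>(3,3) << 700, 0, 320,
                                                 0, 700, 240,
                                                 0,   0,   1);
cv::Mat distCoeffs = (cv::Mat_<double>(1,5) << -0.25, 0.1, 0, 0, 0);

std::vector<cv::Point2f> undistorted;
undistorted.push_back(cv::Point2f(100.f, 80.f));
undistorted.push_back(cv::Point2f(320.f, 240.f));

std::vector<cv::Point2f> distorted;
DistortPoints(undistorted, distorted, cameraMatrix, distCoeffs);
// 'distorted' now holds where each undistorted pixel would land in the raw (distorted) image.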