I extracted the four corners from sourceImage:
src_vertices[0] = corners[upperLeft];
src_vertices[1] = corners[upperRight];
src_vertices[2] = corners[downLeft];
src_vertices[3] = corners[downRight];
These four corners are warped to destinationImage as follows:
dst_vertices[0] = Point(0,0);
dst_vertices[1] = Point(width, 0);
dst_vertices[2] = Point(0, height);
dst_vertices[3] = Point(width, height);
Mat warpPerspectiveMatrix = getPerspectiveTransform(src_vertices, dst_vertices);
cv::Size size_d = Size(width, height);
cv::Mat destinationImage(height, width, CV_8UC3);
warpPerspective(sourceImage, destinationImage, warpPerspectiveMatrix, size_d, INTER_LINEAR, BORDER_CONSTANT);
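For reference, here is a minimal self-contained sketch of the forward warp above. It assumes corners already holds the four detected cv::Point2f values in the order upperLeft, upperRight, downLeft, downRight, and that width/height are the desired output size (the function name and parameters are mine, not from the original code):

#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

using namespace cv;

// Hypothetical helper wrapping the steps shown in the question.
Mat warpToRectangle(const Mat& sourceImage, const std::vector<Point2f>& corners,
                    int width, int height)
{
    // corners assumed ordered: upperLeft, upperRight, downLeft, downRight
    Point2f src_vertices[4] = { corners[0], corners[1], corners[2], corners[3] };
    Point2f dst_vertices[4] = { Point2f(0.f, 0.f),
                                Point2f((float)width, 0.f),
                                Point2f(0.f, (float)height),
                                Point2f((float)width, (float)height) };

    // 3x3 homography mapping the source quadrilateral onto the full output rectangle
    Mat warpPerspectiveMatrix = getPerspectiveTransform(src_vertices, dst_vertices);

    Mat destinationImage;
    warpPerspective(sourceImage, destinationImage, warpPerspectiveMatrix,
                    Size(width, height), INTER_LINEAR, BORDER_CONSTANT);
    return destinationImage;
}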
Now my question is:
Given a point p(x, y) taken from destinationImage, how can I find the coordinates of that point in the original sourceImage?
In other words, I want to use warpPerspectiveMatrix to do the opposite of what getPerspectiveTransform did.
Answer (score: 5):
You want the inverse perspective transform. If your original transform is S -> S', you need the transform matrix S' -> S:
Mat InversewarpPerspectiveMatrix = getPerspectiveTransform(dst_vertices, src_vertices);
Then build a matrix of the coordinates you want to map back, e.g. Mat PerspectiveCoordinates containing the point (x, y); for cv::perspectiveTransform this should be a vector of Point2f or a 2-channel floating-point Mat.
Finally, call:
perspectiveTransform(PerspectiveCoordinates, OriginalCoordinates, InversewarpPerspectiveMatrix);
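Putting the answer together, a minimal sketch of mapping one destination-image point back into sourceImage coordinates (the helper function and its parameters are mine; src_vertices and dst_vertices are the same corner arrays used in the question):

#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

using namespace cv;

// Map a point p from destinationImage coordinates back to sourceImage coordinates.
Point2f mapBackToSource(const Point2f& p,
                        const Point2f src_vertices[4],
                        const Point2f dst_vertices[4])
{
    // Inverse transform: destination -> source
    Mat InversewarpPerspectiveMatrix = getPerspectiveTransform(dst_vertices, src_vertices);

    // perspectiveTransform applies the 3x3 homography to each point,
    // including the division by the third homogeneous coordinate.
    std::vector<Point2f> PerspectiveCoordinates{ p };
    std::vector<Point2f> OriginalCoordinates;
    perspectiveTransform(PerspectiveCoordinates, OriginalCoordinates, InversewarpPerspectiveMatrix);

    return OriginalCoordinates[0];
}

Equivalently, instead of calling getPerspectiveTransform a second time, you could reuse the forward matrix and invert it with warpPerspectiveMatrix.inv(); both give the same destination-to-source homography.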