I am trying to project a given 3D point onto the image plane. I have posted many questions about this and many people have helped me, and I have also read many related links, but the projection still does not work correctly for me.
I have a 3D point (-455, -150, 0), where x is the depth axis, z is the up axis, and y is horizontal. I have a roll (rotation around the front-to-back axis, x), a pitch (rotation around the left-to-right axis, y), and a yaw (rotation around the vertical axis, z). I also have the camera's position: (x, y, z) = (-50, 0, 100). So I am doing the following. First, I go from world coordinates to camera coordinates using the extrinsic parameters:
double pi = 3.14159265358979323846;
double yp = 0.033716827630996704 * pi / 180;   // roll
double thet = 67.362312316894531 * pi / 180;   // pitch
double k = 89.7135009765625 * pi / 180;        // yaw
// elemental rotation matrices, row-major
double rotxm[9] = { 1,0,0,  0,cos(yp),-sin(yp),  0,sin(yp),cos(yp) };
double rotym[9] = { cos(thet),0,sin(thet),  0,1,0,  -sin(thet),0,cos(thet) };
double rotzm[9] = { cos(k),-sin(k),0,  sin(k),cos(k),0,  0,0,1 };
cv::Mat rotx(3, 3, CV_64F, rotxm);
cv::Mat roty(3, 3, CV_64F, rotym);
cv::Mat rotz(3, 3, CV_64F, rotzm);
cv::Mat rotationm = rotz * roty * rotx; // combined rotation matrix
cv::Mat mpoint3 = (cv::Mat_<double>(1, 3) << -455, -150, 0); // the 3D point location
mpoint3 = mpoint3 * rotationm;  // rotation
cv::Mat position = (cv::Mat_<double>(1, 3) << -50, 0, 100); // the camera position
mpoint3 = mpoint3 - position;   // translation
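For what it's worth, the convention I usually see for the extrinsic transform is p_cam = R · (p_world − C), i.e. the translation is applied before the rotation; the code above rotates first and then subtracts the camera position, which gives R·p − C, a different point unless C is zero. A minimal sketch of the conventional order in plain C++ (no OpenCV; the 90° yaw rotation and the numbers in the usage are made up for illustration):

```cpp
#include <cassert>
#include <cmath>

// Conventional extrinsic transform: p_cam = R * (p_world - C),
// where R is a row-major 3x3 rotation and C is the camera position
// in world coordinates. Note the subtraction happens BEFORE the
// rotation; R*p - C is a different point.
void worldToCamera(const double R[9], const double C[3],
                   const double pWorld[3], double pCam[3]) {
    double d[3] = { pWorld[0] - C[0], pWorld[1] - C[1], pWorld[2] - C[2] };
    for (int i = 0; i < 3; ++i)
        pCam[i] = R[3*i] * d[0] + R[3*i + 1] * d[1] + R[3*i + 2] * d[2];
}
```

For example, with a 90° yaw (rotation about z) and camera at (1, 2, 3), the world point (2, 2, 3) ends up one unit along the camera's second axis.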
Now I want to go from camera coordinates to image coordinates.
The first solution, as I understood it from some sources, was:
Mat myimagepoint3 = mpoint3 * mycameraMatrix;
This did not work.
The second solution was:
double fx = cameraMatrix.at<double>(0, 0);
double fy = cameraMatrix.at<double>(1, 1);
double cx1 = cameraMatrix.at<double>(0, 2);
double cy1 = cameraMatrix.at<double>(1, 2);
double xt = mpoint3.at<double>(0) / mpoint3.at<double>(2);
double yt = mpoint3.at<double>(1) / mpoint3.at<double>(2);
double u = xt * fx + cx1;
double v = yt * fy + cy1;
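One thing to note about this per-component formula: it only yields a meaningful pixel when the point's depth along the camera's optical axis is positive (z > 0 in OpenCV's camera convention). A point with z ≤ 0 sits behind the image plane, which produces exactly the far-off or negative pixel values described below. A minimal sketch with that guard (fx, fy, cx, cy in the usage are placeholder values, not your calibration):

```cpp
#include <cassert>
#include <cmath>

// Pinhole projection of a camera-space point (x, y, z) to pixel (u, v).
// Returns false when z <= 0, i.e. the point is behind the camera and
// no valid pixel exists.
bool projectPinhole(double x, double y, double z,
                    double fx, double fy, double cx, double cy,
                    double &u, double &v) {
    if (z <= 0.0) return false;  // behind the camera: projection is meaningless
    u = fx * (x / z) + cx;       // same formula as xt * fx + cx1 above
    v = fy * (y / z) + cy;
    return true;
}
```

So a first diagnostic step is to print mpoint3's z component before projecting and confirm it is positive.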
But this did not work either.
I also tried the OpenCV function fisheye::projectPoints (which goes from world to image coordinates):
Mat recv2;
cv::Rodrigues(rotationm, recv2);
//inputpoints a vector contains one point which is the 3d world coordinate of the point
//outputpoints a vector to store the output point
cv::fisheye::projectPoints(inputpoints, outputpoints, recv2, position, mycameraMatrix, mydiscoff);
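One thing worth double-checking here (this is my reading of the OpenCV convention, not something visible in the snippet): projectPoints-style functions expect rvec/tvec describing the world→camera transform, so the translation argument should be tvec = −R·C, not the camera's world position C passed directly. A sketch of that conversion in plain C++ (R row-major; the identity-rotation usage below is just for illustration):

```cpp
#include <cassert>
#include <cmath>

// Convert a camera position C (given in world coordinates) into the
// tvec of a world->camera extrinsic pair (R, tvec): tvec = -R * C.
// Passing C itself as tvec is only correct when R is the identity.
void cameraPositionToTvec(const double R[9], const double C[3],
                          double tvec[3]) {
    for (int i = 0; i < 3; ++i)
        tvec[i] = -(R[3*i] * C[0] + R[3*i + 1] * C[1] + R[3*i + 2] * C[2]);
}
```

If the rest of the pipeline is right, feeding the converted tvec (instead of the raw position) into projectPoints is the kind of fix that moves the projected point from "nowhere near" to the expected location.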
But this did not work either.
By "did not work" I mean: I know where the point should appear in the image, but when I draw it, it always lands somewhere else (not even close), and sometimes I even get negative coordinates.
Note: there are no syntax errors or exceptions, but I may have made typos while copying the code here. Can anyone suggest what I am doing wrong?