Details:
I am using solvePnP() from OpenCV 3 under ROS Kinetic to estimate my robot's pose from LED markers.
The camera resolution is set to 128 x 128.
When I run my code with OpenCV it always returns absurd results, while MATLAB gives far more reasonable ones, so it seems I do not understand PnP correctly.
After reading the OpenCV and Matlab documentation I am still confused and cannot see where I went wrong.
Here is the C++ code:
#include <opencv2/opencv.hpp>
#include <opencv2/calib3d/calib3d_c.h> // for the CV_ITERATIVE / CV_EPNP / CV_P3P flags
using namespace cv;
using namespace std;

vector<Point2f> p2d;
p2d.push_back(Point2f(65,55));
p2d.push_back(Point2f(91,57));
p2d.push_back(Point2f(63,83));
p2d.push_back(Point2f(90,85));
vector<Point3f> p3d;
p3d.push_back(Point3f(0.687076330185,0.966806054115,1.80424296856));//topleft
p3d.push_back(Point3f(-0.18462985754,0.96837836504,1.79057383537));//topright
p3d.push_back(Point3f(0.691743910313,0.056151561439,1.73733282089));//bottomleft
p3d.push_back(Point3f(-0.179948031902,0.0538983009756,1.74941062927));//bottomright
Mat cameraMatrix = Mat::zeros(3,3,CV_64FC1);//fx,0,u0;0,fy,v0;0,0,1
cameraMatrix.at<double>(0,0) = 1.138594e+02;
//cameraMatrix.at<double>(0,1) = -0.0797;
cameraMatrix.at<double>(1,1) = 1.139064e+02;
cameraMatrix.at<double>(0,2) = 7.63093e+01;
cameraMatrix.at<double>(1,2) = 6.68583e+01;
cameraMatrix.at<double>(2,2) = 1;
Mat distortion = Mat::zeros(5,1,CV_64FC1);//k1,k2,p1,p2,k3
distortion.at<double>(0,0) = -2.837e-01;
distortion.at<double>(1,0) = 0.0882e+00;
distortion.at<double>(2,0) = -0.001774514973451;
distortion.at<double>(3,0) = 6.2359e-05;
distortion.at<double>(4,0) = 7.973409664780424e-04;
Mat rvec = Mat::zeros(3,1,CV_64FC1);
Mat tvec = Mat::zeros(3,1,CV_64FC1);
solvePnP(Mat(p3d),Mat(p2d),cameraMatrix,distortion,rvec,tvec,false,CV_ITERATIVE);
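As far as I understand, solvePnP's rvec/tvec map world coordinates into the camera frame, while MATLAB's estimateWorldCameraPose returns the camera pose directly in the world frame, so to compare the two I would have to invert solvePnP's output, roughly like the sketch below (cameraPoseInWorld is only an illustrative helper, not part of my working code):

// Sketch only, assuming rvec/tvec transform world coordinates into the camera frame.
// Inverting that transform gives the camera pose expressed in the world frame.
void cameraPoseInWorld(const Mat& rvec, const Mat& tvec, Mat& R_wc, Mat& C_w)
{
    Mat R;
    Rodrigues(rvec, R);      // 3x3 rotation, world -> camera
    R_wc = R.t();            // camera -> world rotation
    C_w  = -R.t() * tvec;    // camera centre expressed in world coordinates
}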
The result with CV_ITERATIVE is as follows:
pose:
position:
x: 0.498402438763
y: 0.409237140455
z: 1.87770535106
orientation:
x: -0.0660728111319
y: 0.0615960086814
z: 0.995333090477
w: -0.0339463450942
The result with CV_EPNP is as follows:
pose:
position:
x: 0.0447529023645
y: 0.640655332601
z: 1.92705476321
orientation:
x: 0.0605125209219
y: -0.00392024438163
z: 0.997734267547
w: -0.0291409996095
The result with CV_P3P is as follows:
pose:
position:
x: 0.609777855777
y: 0.962804613723
z: 2.20123033932
orientation:
x: -0.114062563556
y: -0.0956252432628
z: 0.988152306053
w: -0.0374241163864
Here is the Matlab code:
tangentialDistortion = [-0.001774514973451,6.230176254781805e-05];
radialDistortion = [-0.283675926885368,0.088238919987596,7.973409664780424e-04];
IntrinsicMatrix = [113.859398285567 0 0
-0.0797237890933342 113.906402839770 0
76.3090526736352 66.8576711008518 1];
cameraParams = cameraParameters('IntrinsicMatrix',IntrinsicMatrix,'RadialDistortion',radialDistortion,'TangentialDistortion',tangentialDistortion);
imagePoints = [65,55;91,57;63,83;90,85];
[undistortedPoints, reprojectionErrors] = undistortPoints(imagePoints,cameraParams);
worldPoints = [0.687080681324,0.966781198978,1.80424845219;
-0.18462985754,0.96837836504,1.79057383537;
0.691743910313,0.056151561439,1.73733282089;
-0.179948031902,0.0538983009756,1.74941062927
];
[worldOrientation,worldLocation] = estimateWorldCameraPose(undistortedPoints,worldPoints,cameraParams);
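One thing I am not sure about is the intrinsics convention: as far as I understand, MATLAB's IntrinsicMatrix is stored as the transpose of OpenCV's cameraMatrix, and MATLAB uses 1-based pixel coordinates while OpenCV uses 0-based ones. Under that assumption (this is only a sketch, not something from my working code), the OpenCV matrix corresponding to the intrinsics above would be roughly:

// Sketch of my assumption only: transpose MATLAB's IntrinsicMatrix and shift
// the principal point by 1 pixel (1-based MATLAB -> 0-based OpenCV).
Mat K = (Mat_<double>(3,3) <<
    113.859398285567, -0.0797237890933342, 76.3090526736352 - 1.0,
    0.0,              113.906402839770,    66.8576711008518 - 1.0,
    0.0,              0.0,                 1.0);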
The result of the Matlab (P3P) function is as follows:
orientation:
-0.990479259493302 0.0577122306046552 -0.124980538293912
-0.0690583764694848 -0.993684823835936 0.0884387444367233
-0.119087266964189 0.0962276951650630 0.988209721430740
location:
0.734846969941507 0.244533065842483 -1.87846167697158
Ground-truth position of the camera holder:
// A, B, C are the three LEDs on the camera holder
0.513234317303 ,0.13388569653 ,-1.93677461147 //A
0.199585437775 ,0.0487480945885 ,-1.94162213802 //B
0.0308324769139,0.00147355603985,-1.9436986446 //C
0.2478840773 ,0.0613691157, -1.9406984647 //average
Can anyone help me?
This problem has been bothering me for a long time and I could not find any relevant answer. Thanks for your time!