I obtained the camera calibration parameters (camera matrix, distortion coefficients, rotation and translation vectors, and image points) using the OpenCV code here.
By hard-coding the camera matrix, the rotation and translation vectors, and the object point coordinates (X, Y, Z, 1) in MATLAB, I cannot reproduce the same image point values. What am I missing here? Do I also need to account for the distortion coefficients in order to get the exact (correct) image points?
MATLAB code:
% Define all the parameters: camera matrix, sample image point, object point, rotation and translation vectors %
cameraMatrix = [5.9354 0 3.1950; 0 5.9354 2.3950 ; 0 0 1]
rotationMatrix = [2.5233 1.6803 3.0728];
translationMatrix = [1.2682 1.9657 8.0141];
X = [0; 0; 0; 1];
rotationMatrix = transpose(rotationMatrix);
translationMatrix = transpose(translationMatrix);
%convert the rotation vector into rotation matrix using Rodrigues func.%
rotMat = rodrigues(rotationMatrix);
R_T = horzcat(rotMat, translationMatrix)
%Convert to 2D points%
imgPts = cameraMatrix * R_T * X
lastElement = imgPts(end);
ScreenImgPts = imgPts / lastElement
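For reference, the code above implements the standard pinhole projection s * [u; v; 1] = cameraMatrix * [R | t] * [X; Y; Z; 1]: the scale s is the last element of imgPts, and dividing by it yields the pixel coordinates (u, v). Note that this model ignores lens distortion entirely.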
The object points are defined by the checkerboard square size (30 mm), i.e. [0,0,0,1], [30,0,0,1], and so on.
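In case it is useful, here is a minimal sketch (assuming a 7x5 grid of corners with 30 mm squares, matching the points used later; the point ordering may differ from a hand-typed list) that builds these homogeneous object points programmatically instead of typing them by hand:

% build a 7x5 grid of checkerboard corners (30 mm squares) as homogeneous points
squareSize = 30; % mm
[gridX, gridY] = meshgrid(0:squareSize:180, 0:squareSize:120);
objPtsHom = [gridX(:), gridY(:), zeros(numel(gridX),1), ones(numel(gridX),1)];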
However, the image points I calculate do not match the image points stored in the XML file. My results are as follows
The output (image points) for the first, second, and third points should be:
The output file with all the parameters is here
Answer 0 (score: 2)
The problem was quite simple: I had rounded the values of all the important parameters (camera matrix, rotation and translation) to four decimal places, while the actual values carry an exponent (e) at the end. For example, 5.9354136482375827e+002 is roughly 593.54, not 5.9354, so rounding away the exponent produced incorrect values.
Below is the code with the corrected values.
% Define all the parameters: camera matrix, sample image point, object point, rotation and translation vectors %
cameraMatrix = [5.9354136482375827e+002 0. 3.1950000000000000e+002; 0. 5.9354136482375827e+002 2.3950000000000000e+002 ; 0 0 1;]
%Rotation and translation vectors of different planes (snapshots)%
rotationVector = [2.5233190617669338e-001 1.6802568443347082e-001 3.0727563215131681e+000];
translationVector = [1.2682348793063555e+002 1.9656574525587070e+002 8.0141048598043449e+002];
% rotationVector = [2.3492892819146791e-001 1.6451261910667694e-001 3.0787833660290516e+000];
% translationVector = [1.2806533156889765e+002 1.9877886039281353e+002 8.0447195879431570e+002];
% rotationVector = [2.1721 1.6300 3.0619];
% translationVector = [1.2661 1.9511 8.0681];
distCoeffs = [1.0829115704079707e-001 -1.0278232972256371e+000 0 0 1.7962320082487011e+000]; % k1, k2, p1, p2, k3 %
k1 = distCoeffs(1);
k2 = distCoeffs(2);
p1 = distCoeffs(3); % tangential distortion is zero here, so it is not applied below
p2 = distCoeffs(4);
k3 = distCoeffs(end);
% X = [0 0 0; 30 0 0]
rotationVector = transpose(rotationVector);
translationVector = transpose(translationVector);
%convert the rotation vector into rotation matrix using Rodrigues func.%
rotMat = rodrigues(rotationVector)
R_T = horzcat(rotMat, translationVector)
%Convert to 2D points (the undistorted projection below is kept for reference)%
% imgPts = cameraMatrix * R_T * X
%
% lastElement = imgPts(end)
%
% ScreenImgPts = imgPts / lastElement
%%%%%%%%%%%%%% Adding calculation for distortion parameters%%%%%%%%%%
objectPoints = [0 0 0; 30 0 0; 60 0 0; 90 0 0; 120 0 0; 150 0 0; 180 0 0;
                0 30 0; 30 30 0; 60 30 0; 90 30 0; 120 30 0; 150 30 0; 180 30 0;
                0 60 0; 30 60 0; 60 60 0; 90 60 0; 120 60 0; 150 60 0; 180 60 0;
                0 90 0; 30 90 0; 60 90 0; 90 90 0; 120 90 0; 150 90 0; 180 90 0;
                0 120 0; 30 120 0; 60 120 0; 90 120 0; 120 120 0; 150 120 0; 180 120 0;]
% preallocate: one column of homogeneous pixel coordinates per object point
screenCoords = zeros(3, 35);
for i = 1:35
    Xelement = objectPoints(i,:);
    % transform the object point into the camera frame and normalize by depth
    NormXY = (rotMat * transpose(Xelement)) + translationVector;
    lastElement = NormXY(end);
    NormXY = NormXY / lastElement;
    x = NormXY(1);
    y = NormXY(2);
    % radial distortion model: x' = x * (1 + k1*r^2 + k2*r^4 + k3*r^6)
    r2 = power(x,2) + power(y,2);
    r4 = power(r2,2);
    r6 = power(r2,3);
    xcorr = x * (1 + k1*r2 + k2*r4 + k3*r6);
    ycorr = y * (1 + k1*r2 + k2*r4 + k3*r6);
    % map the distorted normalized coordinates back to pixel coordinates
    XY = [xcorr; ycorr; 1];
    screenCoords(:,i) = cameraMatrix * XY;
end
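Since the last row of cameraMatrix is [0 0 1], the third element of every column of screenCoords is 1, so the first two rows are already the pixel coordinates. A quick way to compare against the stored points (xmlPts here is a hypothetical 2x35 array, assumed to hold the image points loaded from the XML file):

pixelPts = screenCoords(1:2, :); % (u, v) in pixels
maxErr = max(abs(pixelPts(:) - xmlPts(:))) % should be near zero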
Answer 1 (score: 1)
Yes, you need to take the distortion into account. Also, if you are working in MATLAB, it may be easier to use the Camera Calibrator App built into MATLAB.
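As a rough sketch of that route (assuming the Computer Vision Toolbox is installed; exact names and option defaults should be checked against its documentation), the projection with distortion can be done with cameraParameters and worldToImage. Note that MATLAB uses the post-multiply (row-vector) convention, so the OpenCV-style intrinsic and rotation matrices are transposed here:

% sketch using Computer Vision Toolbox conventions (not from the original post)
camParams = cameraParameters('IntrinsicMatrix', cameraMatrix', ...
    'RadialDistortion', [k1 k2 k3], 'TangentialDistortion', [p1 p2]);
imagePoints = worldToImage(camParams, rotMat', translationVector', ...
    objectPoints, 'ApplyDistortion', true);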