Independent rotation of stereo cameras

Date: 2015-08-11 18:00:23

Tags: camera computer-vision 3d-reconstruction stereoscopy

I have two cameras pointing at the same scene. When they are parallel to each other, I can convert from a real-world position to each screen coordinate, and from the two screen coordinates back to the real-world position.


From a real-world position to each screen coordinate (the focal length f is known):

xl = XL / Z * f
yl = 0
xr = XR / Z * f
yr = 0

From the two screen coordinates to the real-world position:

XL + XR = D
xl / f = XL / Z
xr / f = XR / Z

Z = f * D / (xl + xr)
XL = xl / f * Z
YL = yl / f * Z
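
For reference, here is a minimal sketch of the parallel case in C++; the numeric values of f, D and the screen coordinates are placeholder assumptions, and the variable names follow the formulas above:

#include <cstdio>

int main() {
    const double f = 0.42;             // focal length (placeholder value)
    const double D = 1.0;              // distance between the two cameras (placeholder value)
    const double xl = 0.10, yl = 0.05; // left screen coordinates (placeholder values)
    const double xr = 0.20;            // right screen coordinate (placeholder value)

    // Depth from disparity, then back-projection of the left ray.
    const double Z  = f * D / (xl + xr);
    const double XL = xl / f * Z;
    const double YL = yl / f * Z;

    std::printf("Z = %f, XL = %f, YL = %f\n", Z, XL, YL);
    return 0;
}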

The cameras now each have an independent three-axis rotation, (α, β, ζ) and (α', β', ζ'). These are effectively their yaw, pitch and roll. Each camera first rotates by α around the y axis, then by β around the new x axis, and finally by ζ around the new z axis.

I can still convert a real-world position to each screen coordinate by first rotating the real-world position and then applying the same formulas as above:

(AL, BL, CL) = rot33_axis3(ζ) * rot33_axis1(β) * rot33_axis2(α) * (XL, YL, Z)
(AR, BR, CR) = rot33_axis3(ζ') * rot33_axis1(β') * rot33_axis2(α') * (XR, YR, Z)
xl = AL / CL * f
yl = BL / CL * f
xr = AR / CR * f
yr = BR / CR * f
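
The helpers rot33_axis1/2/3 are not defined in this post; as a sketch, the same projection could be written with OpenCV's Matx types, assuming axis1, axis2 and axis3 mean elementary rotations about the x, y and z axes (which matches the α-about-y, β-about-x, ζ-about-z order described above):

#include <opencv2/opencv.hpp>
#include <cmath>

// Elementary rotations; axis1 = x, axis2 = y, axis3 = z (assumed convention).
cv::Matx33d rot33_axis1(double a) {
    return cv::Matx33d(1, 0, 0,
                       0, std::cos(a), -std::sin(a),
                       0, std::sin(a),  std::cos(a));
}
cv::Matx33d rot33_axis2(double a) {
    return cv::Matx33d( std::cos(a), 0, std::sin(a),
                        0,           1, 0,
                       -std::sin(a), 0, std::cos(a));
}
cv::Matx33d rot33_axis3(double a) {
    return cv::Matx33d(std::cos(a), -std::sin(a), 0,
                       std::sin(a),  std::cos(a), 0,
                       0,            0,           1);
}

// Rotate the real position, then project it, as in the formulas above.
cv::Point2d projectLeft(const cv::Matx31d& P, double alpha, double beta, double zeta, double f) {
    cv::Matx31d r = rot33_axis3(zeta) * rot33_axis1(beta) * rot33_axis2(alpha) * P;
    return cv::Point2d(r(0) / r(2) * f, r(1) / r(2) * f); // xl = AL / CL * f, yl = BL / CL * f
}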

I have tested this, and the computed coordinates match the screen.

My problem now is computing the real-world position from the two screen coordinates. I am doing:

(al, bl, cl) = rot33_axis2(-α) * rot33_axis1(-β) * rot33_axis3(-ζ) * (xl, yl, f)
(ar, br, cr) = rot33_axis2(-α') * rot33_axis1(-β') * rot33_axis3(-ζ') * (xr, yr, f)

XL + XR = D
al / f = XL / Z
ar / f = XR / Z

Z = f * D / (al + ar)
XL = al / f * Z
YL = bl / f * Z

Unfortunately, this does not work.

My idea was to take the screen coordinates, assign the focal length as the z value, apply the rotation matrices in reverse order with negated angles (at that point the screens have been rotated "back" into a plane parallel to the line connecting the two cameras), and then apply the same formulas as in the first case.

What am I doing wrong? Is it wrong to start from (xl, yl, f)?

编辑1:

Based on aledalgrande's answer, here is some OpenCV code:

//Image is 640x360, focal is 0.42
Matx33d camMat = Matx33d(
0.42f * 640.0f, 0.0f, 320.0f,
0.0f, 0.42f * 360.0f, 180.0f,
0.0f, 0.0f, 1.0f);
Matx41d distCoeffs = Matx41d(0.0f, 0.0f, 0.0f, 0.0f);

Matx31d rvec0, tvec0, rvec1, tvec1;

solvePnP(objPoints, imgPoints0, camMat, distCoeffs, rvec0, tvec0);
solvePnP(objPoints, imgPoints1, camMat, distCoeffs, rvec1, tvec1);
//Results make sense if I use projectPoints

Matx33d rot0;
Rodrigues(rvec0, rot0);
Matx34d P0 = Matx34d(
rot0(0, 0), rot0(0, 1), rot0(0, 2), tvec0(0), // [R | t] for camera 0
rot0(1, 0), rot0(1, 1), rot0(1, 2), tvec0(1),
rot0(2, 0), rot0(2, 1), rot0(2, 2), tvec0(2));

Matx33d rot1;
Rodrigues(rvec1, rot1);
Matx34d P1 = Matx34d(
rot1(0, 0), rot1(0, 1), rot1(0, 2), tvec1(0), // [R | t] for camera 1
rot1(1, 0), rot1(1, 1), rot1(1, 2), tvec1(1),
rot1(2, 0), rot1(2, 1), rot1(2, 2), tvec1(2));

Point u0_(353, 156);
Point u1_(331, 94);
Point2d u0, u1; // normalized image coordinates (pixel coordinates with camMat.inv() applied)

Matx33d camMatInv = camMat.inv();
u0.x = u0_.x * camMatInv(0, 0) + u0_.y * camMatInv(0, 1) + 1.0f * camMatInv(0, 2);
u0.y = u0_.x * camMatInv(1, 0) + u0_.y * camMatInv(1, 1) + 1.0f * camMatInv(1, 2);
u1.x = u1_.x * camMatInv(0, 0) + u1_.y * camMatInv(0, 1) + 1.0f * camMatInv(0, 2);
u1.y = u1_.x * camMatInv(1, 0) + u1_.y * camMatInv(1, 1) + 1.0f * camMatInv(1, 2);

Matx14d A1(u0.x * P0(2, 0) - P0(0, 0), u0.x * P0(2, 1) - P0(0, 1), u0.x * P0(2, 2) - P0(0, 2), u0.x * P0(2, 3) - P0(0, 3));
Matx14d A2(u0.y * P0(2, 0) - P0(1, 0), u0.y * P0(2, 1) - P0(1, 1), u0.y * P0(2, 2) - P0(1, 2), u0.y * P0(2, 3) - P0(1, 3));
Matx14d A3(u1.x * P1(2, 0) - P1(0, 0), u1.x * P1(2, 1) - P1(0, 1), u1.x * P1(2, 2) - P1(0, 2), u1.x * P1(2, 3) - P1(0, 3));
Matx14d A4(u1.y * P1(2, 0) - P1(1, 0), u1.y * P1(2, 1) - P1(1, 1), u1.y * P1(2, 2) - P1(1, 2), u1.y * P1(2, 3) - P1(1, 3));

double normA1 = norm(A1), normA2 = norm(A2), normA3 = norm(A3), normA4 = norm(A4);

Matx44d A(
A1(0) / normA1, A1(1) / normA1, A1(2) / normA1, A1(3) / normA1,
A2(0) / normA2, A2(1) / normA2, A2(2) / normA2, A2(3) / normA2,
A3(0) / normA3, A3(1) / normA3, A3(2) / normA3, A3(3) / normA3,
A4(0) / normA4, A4(1) / normA4, A4(2) / normA4, A4(3) / normA4);

SVD svd;
Matx41d u; // homogeneous solution (X, Y, Z, W); divide by u(3) to get the 3D point
svd.solveZ(A, u);

1 Answer:

Answer 0 (score: 2):

You cannot use the simplified triangulation formula if the cameras are rotated (general case). You will have to resort to linear triangulation (or other methods, if you want a more accurate result).

// points u0 and u1, projection matrices firstP and secondP

// "Multiple View Geometry in Computer Vision" 12.2 and 4.1.1
cv::Matx14d A1 = u0(0) * firstP.row(2) - firstP.row(0);
cv::Matx14d A2 = u0(1) * firstP.row(2) - firstP.row(1);
cv::Matx14d A3 = u1(0) * secondP.row(2) - secondP.row(0);
cv::Matx14d A4 = u1(1) * secondP.row(2) - secondP.row(1);

double normA1 = cv::norm(A1), normA2 = cv::norm(A2), normA3 = cv::norm(A3), normA4 = cv::norm(A4);

cv::Matx44d A(A1(0) / normA1, A1(1) / normA1, A1(2) / normA1, A1(3) / normA1,
              A2(0) / normA2, A2(1) / normA2, A2(2) / normA2, A2(3) / normA2,
              A3(0) / normA3, A3(1) / normA3, A3(2) / normA3, A3(3) / normA3,
              A4(0) / normA4, A4(1) / normA4, A4(2) / normA4, A4(3) / normA4);

cv::SVD svd;
cv::Matx41d pointHomogeneous;
svd.solveZ(A, pointHomogeneous);
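
As a follow-up to the snippet above: pointHomogeneous is a homogeneous 4-vector (X, Y, Z, W), so dividing by its last component recovers the Euclidean 3D point, expressed in the coordinate frame of the projection matrices:

cv::Point3d point(pointHomogeneous(0) / pointHomogeneous(3),
                  pointHomogeneous(1) / pointHomogeneous(3),
                  pointHomogeneous(2) / pointHomogeneous(3));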