3D reconstruction algorithm implementation not working

Date: 2018-11-17 12:42:32

Tags: matlab computer-vision projection 3d-reconstruction

I am trying to implement an algorithm that reconstructs 3D coordinates with a known Z component from 2D image coordinates.

The intrinsic parameters were computed with MATLAB's Camera Calibrator app. For the extrinsic parameters I used four points in the image whose real-world coordinates I know.
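For completeness, CameraParams.mat holds the cameraParams object from that calibration session. Programmatically, the equivalent would look roughly like the sketch below; the folder name and square size are placeholders for my setup, not values the app requires.

% Rough programmatic equivalent of the Camera Calibrator session that
% produced CameraParams.mat (folder name and square size are placeholders).
calibImages = imageDatastore("calibration_images");
[boardPoints, boardSize] = detectCheckerboardPoints(calibImages.Files);
squareSize = 25; % checkerboard square size in mm (placeholder value)
boardWorldPoints = generateCheckerboardPoints(boardSize, squareSize);
cameraParams = estimateCameraParameters(boardPoints, boardWorldPoints, "WorldUnits", "mm");
save("CameraParams.mat", "cameraParams");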

I then used the approach proposed in the original stack overflow post for the 3D reconstruction:


After obtaining all the matrices, this is the equation that lets me transform an image point into world coordinates:

pinhole model equation:

    s [u, v, 1]^T = M ( R [X, Y, Z_const]^T + t )

where M is the cameraMatrix, R the rotationMatrix, t the tvec, and s is unknown. Z_const is the height at which the orange ball sits; in this example it is 285 mm. So first I need to solve this equation to get s, and then I can find the X and Y coordinates for a chosen image point:

rearranged pinhole model equation:

    [X, Y, Z_const]^T = R^{-1} ( M^{-1} s [u, v, 1]^T - t )
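Since Z_const is known, s follows from the third row of the rearranged equation (just spelling out the step left implicit above):

    s = ( Z_const + (R^{-1} t)_3 ) / ( R^{-1} M^{-1} [u, v, 1]^T )_3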

%% Intrinsic camera parameters
cameraData = load("CameraParams.mat");
cameraParams = cameraData.cameraParams;
intrinsicMatrix = cameraParams.IntrinsicMatrix';

%% Points with known world coordinates
imagePoints = [224 92; 963 81; 200 653; 988 650];
worldPoints = [0   0;  114 0;  0   85;  114 85];

%% Points with unknown world xy-coordinates
unknownPoints = [416 280; 773 275; 414 479; 778 477];
zOffset = 0;

%% Extrinsic camera parameters
[rotationMatrix, translationVector] = extrinsics(imagePoints, worldPoints, cameraParams);

%% Transform image to world coordinates
results = zeros(length(unknownPoints), 3);
A = inv(intrinsicMatrix * rotationMatrix);
for i = 1:length(unknownPoints)
    P = [unknownPoints(i,:) 1]';
    AP = A * P;
    At = A * translationVector';
    s = (zOffset + At(3)) / AP(3);

    results(i,:) = s * AP - At;
end

%% Visualization
allWorldPoints = [[worldPoints zeros(size(worldPoints, 1), 1)]; results];
allImagePoints = [imagePoints; unknownPoints];
[orientation, location] = extrinsicsToCameraPose(rotationMatrix, translationVector);

figure;
scatter(allImagePoints(:,1), allImagePoints(:,2));

figure;
plotCamera("Location", location, "Orientation", orientation, "Size", 20);
hold on;
pcshow(allWorldPoints, [0 0 0], "VerticalAxisDir", "down", "MarkerSize", 40);

The original picture is shown below (flipped in the y direction): original picture

The reconstruction looks like this: 3d reconstruction

As you can see, the four points used to compute the extrinsic parameters show up in the correct place, but the reconstructed points end up in the wrong positions.
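As a consistency check (not part of the script above): if the back-projection were correct, reprojecting the reconstructed points with worldToImage should land back on the pixel coordinates in unknownPoints.

% Sanity check: reproject the reconstructed world points into the image.
% With a consistent model and back-projection this reproduces unknownPoints.
reprojectedPoints = worldToImage(cameraParams, rotationMatrix, translationVector, results);
reprojectionError = reprojectedPoints - unknownPoints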

What is causing this error, and how can I fix it?

0 Answers:

No answers