I am working on visual-odometry code in MATLAB. I used the following example (estimateEssentialMatrix) to obtain the essential matrix; you can open it by entering
openExample('vision/EstimateEssentialMatrixFromAPairOfImagesExample')
in the command window. Then I used
[relativeOrientation,relativeLocation] = relativeCameraPose(E,cameraParams,inlierPoints1,inlierPoints2);
[rotationMatrix,translationVector] = cameraPoseToExtrinsics(relativeOrientation,relativeLocation);
to recover the rotation matrix and translation vector. I then accumulated the translation vectors (which represent the camera position) and plotted them:
T_t = T_t + R_t * translationVector'; % accumulate camera position in the world frame
R_t = R_t * rotationMatrix';          % accumulate camera orientation
location = vertcat(location, [T_t(1), T_t(3)]); % keep the x and z coordinates
plot3(location(:,1), zeros(size(location,1),1), location(:,2))
The initial values were:
R_t = eye(3);
T_t = [0;0;0];
location = [0,0];
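To make these steps concrete, here is how the pieces fit together in a single per-frame loop. This is only a sketch: the frames cell array of already-undistorted grayscale images and the SURF feature pipeline are my assumptions, while the pose-accumulation lines are exactly the ones above.

R_t = eye(3);
T_t = [0; 0; 0];
location = [0, 0];
for k = 2:numel(frames)
    % Detect, describe, and match features between consecutive frames
    pts1 = detectSURFFeatures(frames{k-1});
    pts2 = detectSURFFeatures(frames{k});
    [f1, vpts1] = extractFeatures(frames{k-1}, pts1);
    [f2, vpts2] = extractFeatures(frames{k}, pts2);
    pairs = matchFeatures(f1, f2);
    matched1 = vpts1(pairs(:,1));
    matched2 = vpts2(pairs(:,2));

    % Essential matrix and relative pose, as in the question
    [E, inliers] = estimateEssentialMatrix(matched1, matched2, cameraParams);
    [relativeOrientation, relativeLocation] = relativeCameraPose(E, ...
        cameraParams, matched1(inliers), matched2(inliers));
    [rotationMatrix, translationVector] = cameraPoseToExtrinsics( ...
        relativeOrientation, relativeLocation);

    % Accumulate the pose (note: monocular VO recovers translation only
    % up to scale, so the plotted trajectory has an arbitrary scale)
    T_t = T_t + R_t * translationVector';
    R_t = R_t * rotationMatrix';
    location = vertcat(location, [T_t(1), T_t(3)]);
end
plot3(location(:,1), zeros(size(location,1),1), location(:,2))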
However, I am not getting the correct results. I think the problem is with the cameraParams object that holds the camera parameters.
I used a function supplied with the dataset (ReadCameraModel) to obtain the camera intrinsics and the undistortion LUT. The function's I/O is as follows:
% ReadCameraModel - load camera intrinsics and undistortion LUT from disk
%
% [fx, fy, cx, cy, G_camera_image, LUT] = ReadCameraModel(image_dir, models_dir)
%
% INPUTS:
% image_dir: directory containing images for which camera model is required
% models_dir: directory containing camera models
%
% OUTPUTS:
% fx: horizontal focal length in pixels
% fy: vertical focal length in pixels
% cx: horizontal principal point in pixels
% cy: vertical principal point in pixels
% G_camera_image: transform that maps from image coordinates to the base
%                 frame of the camera. For monocular cameras, this is
%                 simply a rotation. For stereo cameras, this is a rotation
%                 and a translation to the left-most lens.
% LUT: undistortion lookup table. For an image of size w x h, LUT will be an
% array of size [w x h, 2], with a (u,v) pair for each pixel. Maps pixels
% in the undistorted image to pixels in the distorted image
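For reference, applying such a LUT in MATLAB could look like the sketch below. The reshape order is my assumption about how the (u,v) pairs are laid out, and the dataset's SDK may already ship a helper for this step, so treat it only as an illustration.

% I: one raw (already demosaiced, grayscale) image from ./stereo/centre
distorted = im2double(I);
[h, w] = size(distorted);
U = reshape(LUT(:,1), w, h)'; % assumed pixel ordering; verify against the SDK
V = reshape(LUT(:,2), w, h)';
undistorted = interp2(distorted, U, V); % sample the distorted image at each (u,v)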
I obtain the cameraParams object with the following code:
[fx, fy, cx, cy, G_camera_image, LUT] = ReadCameraModel('./stereo/centre','./model');
K = [fx 0 cx; 0 fy cy; 0 0 1]; % intrinsic matrix of the camera
cameraParams = cameraParameters('IntrinsicMatrix',K);
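One thing I am unsure about: the MATLAB documentation defines IntrinsicMatrix in the row-vector convention, [fx 0 0; s fy 0; cx cy 1], which is the transpose of the K built above. So perhaps the call should instead be:

% IntrinsicMatrix expects the transposed (row-vector) convention
cameraParams = cameraParameters('IntrinsicMatrix', K');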
Is this the correct way to obtain the cameraParams object, and if so, what am I doing wrong?