I am converting 3D real-world points (X, Y, Z) in meters to image coordinates (u, v) in pixels. I have access to the camera matrix K, the camera orientation (in degrees), and the translation vector. I use the code below to compute the pixel coordinates. For (X, Y, Z) = (0.04379281, 0.15902013, 0.73328906) I get (u, v) = (-184.52735432, -249.19158505) on an image of size 352x287 px. Is the origin of the pixel coordinate system still the top-left corner of the image? If so, the computed pixel position lies outside the image. Is that correct?
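As I understand it, the projection I am trying to reproduce is x_cam = R·X + t followed by u = fx·x/z + cx and v = fy·y/z + cy. A minimal sketch of that arithmetic with my K and point, but an identity rotation and zero translation (those two values are placeholders, not my actual extrinsics):

```python
import numpy as np

# Pinhole projection: x_cam = R @ X + t, then u = fx*x/z + cx, v = fy*y/z + cy.
K = np.array([[148.130859375, 0, 142.5562286376953],
              [0, 148.130859375, 179.04293823242188],
              [0, 0, 1]])
R = np.eye(3)    # identity rotation, for the sketch only
t = np.zeros(3)  # zero translation, for the sketch only
X = np.array([0.04379281, 0.15902013, 0.73328906])

x_cam = R @ X + t
u = K[0, 0] * x_cam[0] / x_cam[2] + K[0, 2]
v = K[1, 1] * x_cam[1] / x_cam[2] + K[1, 2]
print(u, v)
```

With the real rotation and translation the same formula applies, only with a different x_cam.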
import math

import cv2
import numpy as np


# Calculates a rotation matrix from Euler angles.
def eulerAnglesToRotationMatrix(theta):
    R_x = np.array([[1, 0, 0],
                    [0, math.cos(theta[0]), -math.sin(theta[0])],
                    [0, math.sin(theta[0]), math.cos(theta[0])]])
    R_y = np.array([[math.cos(theta[1]), 0, math.sin(theta[1])],
                    [0, 1, 0],
                    [-math.sin(theta[1]), 0, math.cos(theta[1])]])
    R_z = np.array([[math.cos(theta[2]), -math.sin(theta[2]), 0],
                    [math.sin(theta[2]), math.cos(theta[2]), 0],
                    [0, 0, 1]])
    # Combined rotation: R = R_z @ R_y @ R_x
    R = np.dot(R_z, np.dot(R_y, R_x))
    return R
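One detail I am unsure about: math.cos and math.sin take radians, while my theta values above are listed in degrees. A self-contained sanity check of a single-axis rotation, assuming the angle is first converted with math.radians (the 90-degree value is just an example):

```python
import math
import numpy as np

# Rotation about the x-axis by 90 degrees; math.cos/math.sin expect radians,
# so the angle is converted with math.radians first.
a = math.radians(90.0)
R_x = np.array([[1, 0, 0],
                [0, math.cos(a), -math.sin(a)],
                [0, math.sin(a), math.cos(a)]])
# A proper rotation matrix is orthonormal with determinant +1.
print(np.allclose(R_x @ R_x.T, np.eye(3)), np.isclose(np.linalg.det(R_x), 1.0))
```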
# x-roll, y-pitch, z-yaw (in degrees)
theta = [-128.41639709472657, -11.528900146484375, 53.37379837036133]
# rotation matrix (cv2.projectPoints also accepts a 3x3 matrix
# in place of a 3x1 Rodrigues rotation vector)
rvec = eulerAnglesToRotationMatrix(theta)
# translation vector
tvec = np.array([[0.26409998536109927], [1.294700026512146],
                 [0.017799999564886094]], np.float64)
# camera matrix
cameraMatrix = np.array([[148.130859375, 0, 142.5562286376953],
                         [0, 148.130859375, 179.04293823242188],
                         [0, 0, 1]], np.float64)
# Point in (X, Y, Z) meters
objPts = np.array([[0.04379281, 0.15902013, 0.73328906]], np.float64)
# zero distortion coefficients
distCoeffs = np.array([0, 0, 0, 0, 0], np.float64)
imgPts, _ = cv2.projectPoints(objPts, rvec, tvec, cameraMatrix, distCoeffs)
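To check whether a result lands inside the image, I wrote this small helper, assuming OpenCV's convention that the pixel origin is the top-left corner with u growing rightward and v growing downward (in_image is my own name, not an OpenCV function):

```python
def in_image(u, v, width, height):
    # OpenCV images place the origin at the top-left pixel;
    # u increases to the right, v increases downward.
    return 0 <= u < width and 0 <= v < height

# The projected point I obtained, against my 352x287 image:
print(in_image(-184.52735432, -249.19158505, 352, 287))  # False: outside
```

Under that convention any negative coordinate is necessarily outside the image.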