Obtaining a bird's-eye-view image using aerial metadata

Time: 2015-04-09 15:51:53

Tags: opencv camera homography calibration orthographic

I receive metadata from a UAV along with images from an infrared camera. The metadata contains the pitch, roll and yaw rotation parameters as well as the UAV's GPS position and altitude. Because the UAV rotates, the camera is not always looking at nadir.
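For context, the metadata I have for each frame looks roughly like this (a minimal sketch; the struct itself is only illustrative, but the member names match the m_* variables used in the code below):

struct UavMetadata {
    double m_pitch;   // degrees
    double m_roll;    // degrees
    double m_yaw;     // degrees
    double m_uavLat;  // GPS latitude, degrees
    double m_uavLon;  // GPS longitude, degrees
    double m_uavAlt;  // altitude in meters (assumed above ground level)
};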

My goal is to use the metadata to obtain an orthorectified image, approximating the ground as a plane.

I came across a previous thread (Opencv virtually camera rotating/translating for bird's eye view) that clears up many grey areas about image transformations. However, I am still facing some problems with orthographically projecting the image onto the ground.

I am using OpenCV's pinhole camera model, s * m' = A [R|t] M', to go from image coordinates to world coordinates. The matrices A, R and t are known. Given a pixel coordinate (x, y) I want to obtain the corresponding 3D coordinate on the ground surface, in meters. Once I have the ground coordinates of the four points that tie my image to the ground, I will use them to compute a homography and warp my image into a bird's-eye view.
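In other words, for each image corner I want to cast a ray through the pixel, rotate it into world coordinates and intersect it with a flat ground plane. A minimal sketch of that idea (assuming R is the world-to-camera rotation and camPos the camera centre in world coordinates; the function name and signature are only illustrative, the code I actually run is further below):

#include <opencv2/core.hpp>

// Back-project pixel (u, v) onto the ground plane Z = 0 (illustrative sketch).
cv::Point3d pixelToGround(double u, double v,
                          const cv::Mat& A,          // 3x3 intrinsics
                          const cv::Mat& R,          // 3x3 world->camera rotation
                          const cv::Point3d& camPos) // camera centre in world coords
{
    // Ray direction in camera coordinates: A^-1 * (u, v, 1)^T
    cv::Mat uv = (cv::Mat_<double>(3, 1) << u, v, 1.0);
    cv::Mat dirCam = A.inv() * uv;

    // Rotate the ray into world coordinates (R^T is camera->world)
    cv::Mat dirWorld = R.t() * dirCam;

    // Intersect camPos + s * dirWorld with the plane Z = 0
    double s = -camPos.z / dirWorld.at<double>(2, 0);
    return cv::Point3d(camPos.x + s * dirWorld.at<double>(0, 0),
                       camPos.y + s * dirWorld.at<double>(1, 0),
                       0.0);
}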

My code is as follows:

int RefEllipsoid = 23;
double UTMNorthing;
double UTMEasting;  
char UTMZone[4];

LLtoUTM(RefEllipsoid, m_uavLat, m_uavLon, UTMNorthing, UTMEasting, UTMZone);  //// Get UTMNorthing, UTMEasting in meters
std::cout << UTMNorthing << "\t" << UTMEasting << "\n";

cv::Mat source= frame.clone();  
cv::Mat destination;

double f = 500, dist = m_uavAlt;
double alpha, beta, gamma;
alpha = m_yaw   * (CV_PI / 180.0);
beta  = m_pitch * (CV_PI / 180.0);
gamma = m_roll  * (CV_PI / 180.0);  

cv::Size taille = source.size();
double w = (double)taille.width, h = (double)taille.height;

// Rotation matrices around the X,Y,Z axis
///Yaw
cv::Mat RZ = (cv::Mat_<double>(3, 3) <<
    cos(alpha), -sin(alpha), 0,
    sin(alpha),  cos(alpha), 0,
    0,          0,           1);

///Pitch
cv::Mat RY = (cv::Mat_<double>(3, 3) <<
    cos(beta), 0, sin(beta), 
            0, 1,          0,
    -sin(beta), 0,  cos(beta));

///Roll
cv::Mat RX = (cv::Mat_<double>(3, 3) <<
    1,          0,           0,
    0, cos(gamma), -sin(gamma),
    0, sin(gamma),  cos(gamma));

// Composed rotation matrix with (RX,RY,RZ)
cv::Mat R = RZ * RY * RX;   

///translation vector
cv::Mat tvec = cv::Mat::zeros(3,1,cv::DataType<double>::type);
tvec.at<double>(0,0) = UTMEasting;
tvec.at<double>(1,0) = UTMNorthing;  
tvec.at<double>(2,0) = dist;

// Camera intrinsics matrix 3D -> 2D
cv::Mat A2 = (cv::Mat_<double>(3,3) <<
    f, 0, w/2,
    0, f, h/2,
    0, 0,   1);

cv::Point2f src_vertices[4];
src_vertices[0] = cv::Point2f(0,0);
src_vertices[1] = cv::Point2f(w,0);
src_vertices[2] = cv::Point2f(0,h);
src_vertices[3] = cv::Point2f(w,h);

std::vector<cv::Point2f> ptV;
for(int i=0; i<4; i++) ptV.push_back(src_vertices[i]);

// Extrinsics: 3x4 matrix [R|t] combining the rotation and translation
cv::Mat tempMat = (cv::Mat_<double>(3, 4) <<
    R.at<double>(0,0), R.at<double>(0,1), R.at<double>(0,2), tvec.at<double>(0,0),
    R.at<double>(1,0), R.at<double>(1,1), R.at<double>(1,2), tvec.at<double>(1,0),
    R.at<double>(2,0), R.at<double>(2,1), R.at<double>(2,2), tvec.at<double>(2,0));

cv::Point2f dst_vertices[4];

for(int i=0; i<4; i++)
 {
    cv::Mat uvPoint = cv::Mat::zeros(3,1,cv::DataType<double>::type); 
    uvPoint.at<double>(0,0) = ptV.at(i).x; 
    uvPoint.at<double>(1,0) = ptV.at(i).y;
    uvPoint.at<double>(2,0) = -f;

    // Pseudo-inverse of the full projection A*[R|t], used to map image coords back to world coords
    cv::Mat P = (A2*tempMat).inv(cv::DECOMP_SVD);       

    cv::Mat out = P*uvPoint;

    dst_vertices[i].x = out.at<double>(0,0);
    dst_vertices[i].y = out.at<double>(1,0);        
}

cv::Mat warpMatrix = getPerspectiveTransform(src_vertices, dst_vertices);

cv::Mat rotated;
cv::warpPerspective(source, rotated, warpMatrix, rotated.size(), cv::INTER_LINEAR, cv::BORDER_CONSTANT);

cv::imshow("final", rotated);   
cv::waitKey(0);

Unfortunately, the points I obtain are far from what I expect: they all come out very close to each other and far from the UAV position (expressed in UTM coordinates). For example, with the UAV at an altitude of 400 m, every image corner I back-project ends up at roughly (225, -37). Can anyone help me? Do you have any suggestions about what I am doing wrong?

0 Answers:

There are no answers yet.