I can already track SIFT or SURF features with the Lucas-Kanade tracker implemented in OpenCV, since Lucas-Kanade tracks sparse features anyway. What I would like to do instead is track those sparse features with the dense optical flow algorithm by Farneback that is also implemented in OpenCV. Is there an algorithm for that?
Please take a look at this video: "Realtime Dense Optical flow tracking".
The developers claim that they track the selected sparse features with the dense method (Farneback) rather than the sparse method (Lucas-Kanade). How do they do that?
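For context, the sparse Lucas-Kanade tracking referred to in the question would typically look roughly like the sketch below. This is only a minimal illustration: it uses cv::goodFeaturesToTrack in place of SIFT/SURF keypoints for brevity, and the file names are placeholders, not taken from the question.

#include <opencv2/opencv.hpp>
#include <vector>

// minimal sketch: track detected corners from one frame to the next with pyramidal Lucas-Kanade
cv::Mat prevGray = cv::imread( "frame0.png", cv::IMREAD_GRAYSCALE ); // illustrative file names
cv::Mat currGray = cv::imread( "frame1.png", cv::IMREAD_GRAYSCALE );

std::vector<cv::Point2f> prevPts, currPts;
cv::goodFeaturesToTrack( prevGray, prevPts, 500, 0.01, 10 );         // sparse feature selection

std::vector<unsigned char> status;
std::vector<float> err;
cv::calcOpticalFlowPyrLK( prevGray, currGray, prevPts, currPts, status, err );
// status[i] == 1 means point i was tracked successfully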
Answer 0 (score: 3)
To track features using a dense optical flow field (flowMat in the code below), you can proceed as follows:
#include <opencv2/opencv.hpp>
#include <cmath>
#include <vector>

// read images at time t and t + 1 (Farneback expects 8-bit single-channel input;
// filename0 and filename1 hold the paths of the two input frames)
cv::Mat prevImg = cv::imread( filename0, cv::IMREAD_GRAYSCALE ); // image data at time t
cv::Mat currImg = cv::imread( filename1, cv::IMREAD_GRAYSCALE ); // image data at time t + 1
cv::Mat flowMat;                                                 // storage for the dense optical flow field
std::vector<cv::Point2f> prevPoints;                             // points to be tracked

// initialize points to track (example)
prevPoints.push_back( cv::Point2f( 50.3f, 30.f ) );
std::vector<cv::Point2f> currPoints( prevPoints.size() );        // tracked point positions

// compute the dense flow field (example parameters)
cv::calcOpticalFlowFarneback( prevImg, currImg, flowMat, 0.4, 1, 12, 2, 8, 1.2, 0 );

// track points based on the dense optical flow field and bilinear interpolation
for( unsigned int n = 0; n < prevPoints.size(); ++n )
{
    int   ix = static_cast<int>( std::floor( prevPoints[n].x ) );
    int   iy = static_cast<int>( std::floor( prevPoints[n].y ) );
    float wx = prevPoints[n].x - ix;
    float wy = prevPoints[n].y - iy;

    // bilinear interpolation weights
    float w00 = ( 1.f - wx ) * ( 1.f - wy );
    float w10 = ( 1.f - wx ) * wy;
    float w01 = wx * ( 1.f - wy );
    float w11 = wx * wy;

    if( prevPoints[n].x < 0 || prevPoints[n].y < 0 ||
        prevPoints[n].x >= flowMat.cols - 1 || prevPoints[n].y >= flowMat.rows - 1 )
    {
        // these points lie outside the image ROI and cannot be tracked
        currPoints[n] = prevPoints[n];
    }
    else
    {
        /*
           Bilinear interpolation of the flow vector from the flow field at the given
           location. The interpolation is needed because the points to track can be
           given with subpixel accuracy.
        */
        currPoints[n] = prevPoints[n]
            + flowMat.at<cv::Point2f>( iy,     ix     ) * w00
            + flowMat.at<cv::Point2f>( iy + 1, ix     ) * w10
            + flowMat.at<cv::Point2f>( iy,     ix + 1 ) * w01
            + flowMat.at<cv::Point2f>( iy + 1, ix + 1 ) * w11;
    }
}
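For a real-time setting like the one shown in the video, the same per-frame step can be applied to consecutive frames from a video stream. Below is a minimal sketch under my own assumptions (camera index 0, BGR input frames); the bilinear-interpolation loop from the answer above would go where the comment indicates.

#include <opencv2/opencv.hpp>

// minimal sketch: feed consecutive video frames into the same dense-flow tracking step
cv::VideoCapture cap( 0 );                       // illustrative source: first camera
cv::Mat frame, currGray, prevGray, flowMat;

while( cap.read( frame ) )
{
    cv::cvtColor( frame, currGray, cv::COLOR_BGR2GRAY );
    if( !prevGray.empty() )
    {
        cv::calcOpticalFlowFarneback( prevGray, currGray, flowMat,
                                      0.4, 1, 12, 2, 8, 1.2, 0 );
        // ...apply the bilinear-interpolation loop from the answer here to move
        //    prevPoints to currPoints, then swap them before the next iteration
    }
    prevGray = currGray.clone();
}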
Answer 1 (score: 0)
OpenCV has a function, calcOpticalFlowFarneback(), that does exactly this.
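For reference, here is a minimal call with the parameters spelled out; the values are the same illustrative ones used in the answer above, and prevGray/currGray are assumed to be 8-bit single-channel frames.

#include <opencv2/opencv.hpp>

cv::Mat prevGray = cv::imread( "frame0.png", cv::IMREAD_GRAYSCALE ); // illustrative file names
cv::Mat currGray = cv::imread( "frame1.png", cv::IMREAD_GRAYSCALE );
cv::Mat flow;                                  // output: CV_32FC2 flow field, one vector per pixel

cv::calcOpticalFlowFarneback( prevGray, currGray, flow,
                              0.4,   // pyr_scale: image scale between pyramid levels (< 1)
                              1,     // levels: number of pyramid levels
                              12,    // winsize: averaging window size
                              2,     // iterations per pyramid level
                              8,     // poly_n: pixel neighborhood for the polynomial expansion
                              1.2,   // poly_sigma: Gaussian sigma for the expansion
                              0 );   // flags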