Separating the image background with OpenCV

Date: 2015-03-23 16:33:33

Tags: image opencv background

Actually, I'm trying to subtract the background from this image. Obviously, I only want to subtract the green background, and this is the code I'm using:

Mat img_object = imread(patternImageName);
Mat imageInHSV;
cvtColor(img_object, imageInHSV, CV_BGR2HSV);

Mat chan[3], imgThreshed;
split( imageInHSV, chan );
Mat H = chan[0];
// compute statistics for Hue value
cv::Scalar mean, stddev;
cv::meanStdDev(H, mean, stddev);

// Hue range chosen manually from the most frequent values in H (the green band);
// note these hard-coded bounds are not derived from the statistics above
float minHue = 80;
float maxHue = 95;
cout << "Mean: " << mean[0] << " StdDev: " << stddev[0] << endl;
cout << H << endl; // dump the Hue matrix for inspection
// STEP 2: detection phase
cv::inRange(H, cv::Scalar(minHue), cv::Scalar(maxHue), imgThreshed);
imshow("thresholded", imgThreshed);

I checked the values of channel H to decide on minHue and maxHue, so I picked the interval of the most frequent values in the matrix, which should definitely be the green values. However, I got this result, which is not what I'm looking for because it is missing parts. Any idea how to improve it? How can I better subtract the background from this kind of image?
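Rather than eyeballing the dumped Hue matrix, the dominant hue band can be located programmatically from the histogram. Here is a minimal sketch of that idea (not from the original post; thresholdDominantHue and halfWidth are made-up names, and it assumes an 8-bit BGR input):

#include <opencv2/opencv.hpp>
#include <algorithm>

// Hypothetical helper: find the most frequent hue and threshold a band
// around it. halfWidth (in hue units, on OpenCV's 0-179 scale) is a tunable guess.
cv::Mat thresholdDominantHue(const cv::Mat& bgr, int halfWidth = 8)
{
    cv::Mat hsv;
    cv::cvtColor(bgr, hsv, cv::COLOR_BGR2HSV);

    std::vector<cv::Mat> chan;
    cv::split(hsv, chan);

    // Histogram of the Hue channel (8-bit hue in OpenCV spans 0..179).
    int histSize = 180;
    float range[] = { 0, 180 };
    const float* histRange = range;
    cv::Mat hist;
    cv::calcHist(&chan[0], 1, 0, cv::Mat(), hist, 1, &histSize, &histRange);

    // The histogram peak is the dominant (presumably background) hue.
    cv::Point maxLoc;
    cv::minMaxLoc(hist, 0, 0, 0, &maxLoc);
    int peak = maxLoc.y;

    cv::Mat mask;
    cv::inRange(chan[0],
                cv::Scalar(std::max(0, peak - halfWidth)),
                cv::Scalar(std::min(179, peak + halfWidth)),
                mask);
    return mask; // 255 where the pixel's hue is near the dominant hue
}

A band of ±8 hue units is only a starting guess; for a saturated green background the peak typically sits near hue 60 on OpenCV's 0-179 scale.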

1 Answer:

Answer 0 (score: 0)

I'm not sure what your goal is. However, using a range of [mean-stddev, mean+stddev], I got better results on your sample image from the other two channels (Saturation and Value rather than Hue). Averaging the results of all three channels shows some improvement:

#include <opencv2/opencv.hpp>
#include <iostream>

using namespace std;
using namespace cv;

int main()
{
    Mat img_object = imread("1.png");
    Mat imageInHSV;
    cvtColor(img_object, imageInHSV, CV_BGR2HSV);
    Mat chan[3];
    split(imageInHSV, chan);
    Mat result;
    Mat threshImg[3];

    for(int i = 0; i < 3; i++)
    {
        Mat ch = chan[i];

        // compute statistics for each channel
        cv::Scalar mean, stddev;
        cv::meanStdDev(ch, mean, stddev);

        // statistically ~68% of the data falls in this range
        // (assuming a roughly normal distribution)
        float minVal = mean[0] - stddev[0];
        float maxVal = mean[0] + stddev[0];
        cout << "Mean: " << mean[0] << " StdDev: " << stddev[0] << endl;

        // Separate the dominant 68%, which we guess should be the background.
        cv::inRange(ch, cv::Scalar(minVal), cv::Scalar(maxVal), threshImg[i]);
    }

    // Average the results from the three channels (Hue, Saturation, Value).
    // Divide before adding: 8-bit Mat addition saturates at 255.
    result = threshImg[0] / 3 + threshImg[1] / 3 + threshImg[2] / 3;
    imwrite("thresholded_012.jpg", result);
    return 0;
}
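One caveat: averaging three binary masks gives values in {0, 85, 170, 255}, so a pixel flagged by only a single channel still shows up faintly. A stricter follow-up (my suggestion, not part of the original answer; cleanBackgroundMask is a made-up name) is a majority vote over the masks plus a morphological cleanup:

#include <opencv2/opencv.hpp>

// Keep pixels where at least 2 of the 3 channel masks agreed, then fill
// small holes and remove speckles. `avg` is the averaged mask (`result` above).
cv::Mat cleanBackgroundMask(const cv::Mat& avg)
{
    cv::Mat vote;
    // avg is 0, 85, 170 or 255; anything above 127 means >= 2 masks agreed
    cv::threshold(avg, vote, 127, 255, cv::THRESH_BINARY);

    cv::Mat kernel = cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(5, 5));
    cv::morphologyEx(vote, vote, cv::MORPH_CLOSE, kernel); // fill small holes
    cv::morphologyEx(vote, vote, cv::MORPH_OPEN, kernel);  // drop speckles

    // Invert: the foreground is everything outside the background mask.
    cv::Mat fg;
    cv::bitwise_not(vote, fg);
    return fg;
}

The threshold at 127 implements the two-of-three vote, and the close/open pair trades a little edge precision for a much less noisy mask.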

Input image: [image]

Output image: [image]