OpenCV's calcHist returns a total pixel count that exceeds the image resolution

Asked: 2016-11-12 01:21:25

Tags: c++ opencv colors histogram

I am working with 32-bit (RGBA) images using OpenCV in C++. I want to count the number of pixels at each color level from 0 to 255. For a solid black 1920x1080 image, my output currently looks like this:

(B, G, R)
0 = (2073600, 2073600, 2073600)
1 = (0, 0, 0)
2 = (0, 0, 0)
3 = (0, 0, 0)
...
252 = (0, 0, 0)
253 = (0, 0, 0)
254 = (0, 0, 0)
255 = (0, 0, 0)

And for solid white, it looks like this:

(B, G, R)
0 = (0, 0, 0)
1 = (0, 0, 0)
2 = (0, 0, 0)
3 = (0, 0, 0)
...
252 = (0, 0, 0)
253 = (0, 0, 0)
254 = (0, 0, 0)
255 = (2073600, 2073600, 2073600)

The total number of pixels in the image is 1920 * 1080 = 2073600, and in these cases the sum over all bins from 0 to 255 does not exceed it. The problem appears when I take a pure red image and change a single pixel from 255 to 254; then I get the following result:

(B, G, R)
0 = (2073600, 2073600, 0)
1 = (0, 0, 0)
2 = (0, 0, 0)
3 = (0, 0, 0)
...
252 = (0, 0, 0)
253 = (0, 0, 0)
254 = (0, 0, 1)
255 = (0, 0, 2073600)

The total count in the red channel is now 2073601 instead of 2073600. I need the histogram to never report more than the total number of pixels in the image.
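For completeness, the problem image can be reproduced directly in code instead of being loaded from a file. This is only a sketch of the test setup I am describing (the file name and variable names are illustrative); imread with CV_LOAD_IMAGE_COLOR drops the alpha channel anyway, so a 3-channel BGR test image behaves the same:

#include <opencv2/opencv.hpp>

int main() {
    // Build a 1920x1080 pure-red image (OpenCV stores channels as B, G, R)
    cv::Mat red(1080, 1920, CV_8UC3, cv::Scalar(0, 0, 255));
    // Change a single pixel's red value from 255 to 254, as described above
    red.at<cv::Vec3b>(0, 0)[2] = 254;
    // Save it (PNG is lossless) so it can be passed to getHist() below
    cv::imwrite("red_one_pixel_254.png", red);
    return 0;
}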

Here is the code:

#include <opencv2/opencv.hpp>
#include <iostream>
#include <string>
#include <vector>

cv::Mat getHist(std::string filename) {
    cv::Mat img;
    img = cv::imread(filename, CV_LOAD_IMAGE_COLOR);
    if (!img.data) {
        std::cout << "Problem with source\n";
        return cv::Mat();
    }

    std::vector<cv::Mat> bgr_planes;
    cv::split(img, bgr_planes); // split the source image into its B, G and R planes: bgr_planes[0], [1] and [2]
    int histSize = 256; //from 0 to 255 (8 bit)
    float range[] = {0, 256}; //initialize range[] array with two values, 0 and 256, the upper boundary is exclusive.
    const float* histRange = {range}; 
    bool uniform = true;
    bool accumulate = false;
    int channels[] = {0};

    cv::Mat b_hist, g_hist, r_hist;
    cv::calcHist(&bgr_planes[0], 1, channels, cv::Mat(), b_hist, 1, &histSize, &histRange, uniform, accumulate); // the second argument (1) means one input image; cv::Mat() means no mask
    cv::calcHist(&bgr_planes[1], 1, channels, cv::Mat(), g_hist, 1, &histSize, &histRange, uniform, accumulate);
    cv::calcHist(&bgr_planes[2], 1, channels, cv::Mat(), r_hist, 1, &histSize, &histRange, uniform, accumulate);


    int hist_h = img.rows*img.cols;
    int hist_w = 256;
    int bin_w = cvRound( (double)hist_w/histSize);

    cv::Mat histImage(hist_h, hist_w, CV_8UC3, cv::Scalar(0,0,0));
//    cv::Mat histImage(hist_h, hist_w, CV_32F, cv::Scalar(0,0,0));

    // normalize the histogram so values fall in the range indicated by the parameters entered.
    // normalize the result to [ 0, histImage.rows ]

    cv::normalize(b_hist, b_hist, 0, histImage.rows, cv::NORM_MINMAX, -1, cv::Mat()); // b_hist is input array and the output normalized array, okay if they are the same.
    cv::normalize(g_hist, g_hist, 0, histImage.rows, cv::NORM_MINMAX, -1, cv::Mat());
    cv::normalize(r_hist, r_hist, 0, histImage.rows, cv::NORM_MINMAX, -1, cv::Mat());

    int rsum = 0;
    for (int i = 0; i < histSize; i++) {
//        std::cout << hist_h - cvRound(b_hist.at<float>(i));
        std::cout << i << " = (";
        std::cout << cvRound(b_hist.at<float>(i)) << ", ";
        rsum += cvRound(r_hist.at<float>(i));
        std::cout << cvRound(g_hist.at<float>(i)) << ", ";
        std::cout << cvRound(r_hist.at<float>(i)) << ") \n";
    }
    std::cout << "Red channel pixel sum: " << rsum;
    std::cout << " Resolution " << img.rows << "x" << img.cols << " == " << img.rows*img.cols << "\n";
    std::cin.ignore();


    return histImage;
}
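As a sanity check I am considering (just a sketch, intended to be dropped in right before the cv::normalize calls): summing the raw bins with cv::sum should show whether calcHist itself already over-counts, or whether the extra pixel only appears after the normalize/round steps.

    // Sketch: sum the raw calcHist bins before any normalization.
    // If this equals img.total() (= rows * cols) while the printed,
    // normalized values above sum to 2073601, the discrepancy would
    // come from the normalize + cvRound steps rather than from calcHist.
    double raw_red_total = cv::sum(r_hist)[0];
    std::cout << "Raw red-channel bin total: " << raw_red_total
              << " vs. pixels: " << img.total() << "\n";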

0 Answers:

No answers yet.