glBeginQuery GL_OUT_OF_MEMORY error

Time: 2015-11-12 08:33:03

Tags: c++ opengl

I am using a GL query to gather FPS information in my application:

CollectDataBegin();

/*all drawing operations with OpenGL*/

CollectDataEnd();

where:

void RenderingInfo::CollectDataBegin()
{
    //FPS begin
    available = 0;
    GLenum eError;
// UPDATE 1 START
    if (!bQueryGenerated){
        glGenQueries(1, queries);
        bQueryGenerated = true;
    }
// UPDATE 1 END
    //GL_NO_ERROR from glGetError();
    glBeginQuery(GL_TIME_ELAPSED, queries[0]);
    //GL_OUT_OF_MEMORY error from glGetError();
    //FPS end
}

void RenderingInfo::CollectDataEnd()
{
    //FPS begin
    glEndQuery(GL_TIME_ELAPSED);

    iFramesCount++;

    if (iFramesCount == 20)
    {
        iFramesCount = 0;

        // Spin until the query result is available (this blocks the CPU).
        while (!available) {
            glGetQueryObjectiv(queries[0], GL_QUERY_RESULT_AVAILABLE, &available);
        }

        glGetQueryObjectui64v(queries[0], GL_QUERY_RESULT, &timeElapsed);

        // GL_TIME_ELAPSED is reported in nanoseconds.
        float seconds = static_cast<float>(timeElapsed) / std::pow(10, 9);

        xRenderStats.fFPS = static_cast<float>(1.0 / seconds);

        sFPS = std::to_string(xRenderStats.fFPS);
// UPDATE 1 START
        if (bQueryGenerated){
            glDeleteQueries(1, queries);
            bQueryGenerated = false;
        }
// UPDATE 1 END
    }
    //FPS end
}

Private members of the RenderingInfo class:

GLuint queries[1] = {0};
GLint available = 0;
GLuint64 timeElapsed;
int iFramesCount = 0;
bool bQueryGenerated = false; //UPDATE 1

I wrote and tested this code on an Nvidia GeForce GTX 760 with the latest drivers, and I had no problems at all. But after switching to my integrated Intel HD Graphics 4600, I get GL_OUT_OF_MEMORY after calling glBeginQuery(). Interestingly, I do not get this error right away, but only after a number of calls to glBeginQuery().

I could not find any posts related to this issue, so I am asking for your help in solving it.

UPDATE 1: I modified my code according to @Ike's suggestion, but I still get the GL_OUT_OF_MEMORY error.

1 Answer:

Answer 0: (score 0)

After removing the code associated with the GL query, my app no longer produces GL_OUT_OF_MEMORY errors. Since I was using this functionality to measure how long a single frame takes to render, I replaced it with a more reliable method:

void RenderingInfo::CollectDataBegin()
{
    //FPS begin
    ctTimeBegin = clock();
    //FPS end
}

void RenderingInfo::CollectDataEnd()
{
    //FPS begin
    ctTimeEnd = clock();

    dElapsedTime += (static_cast<double>(ctTimeEnd - ctTimeBegin) / CLOCKS_PER_SEC);

    iFramesCount++;

    if (iFramesCount == 20)
    {
        // Guard against an average frame time below the clock() resolution,
        // which would otherwise yield a nonsensically high FPS value.
        if ((dElapsedTime / iFramesCount) < (1.0 / CLOCKS_PER_SEC)){
            xRenderStats.fFPS = 60.0f;
        }
        else{
            xRenderStats.fFPS = static_cast<float>(iFramesCount / dElapsedTime);
        }

        sFPS = std::to_string(xRenderStats.fFPS);

        dElapsedTime = 0.0;
        iFramesCount = 0;
    }
    //FPS end
}

Private RenderingInfo members:

int iFramesCount = 0;
clock_t ctTimeBegin = 0;
clock_t ctTimeEnd = 0;
double dElapsedTime = 0.0;

This does not answer the question of why I received GL_OUT_OF_MEMORY, but it shows a possible way out for anyone who runs into a similar problem.