I'm using Visual Leak Detector to hunt for memory leaks in my program, but I can't figure out how to get rid of a leak coming from FT_Load_Char, and the documentation doesn't say how the GlyphSlot's memory is released...
Here is my code snippet, where I get a leak of about 350 bytes.
// creating ascii symbol map
FT_GlyphSlot g = face->glyph; // FT_Load_Char renders into this slot
for (int i = 32; i < 128; i++) {
    if (FT_Load_Char(face, i, FT_LOAD_RENDER)) { // leak comes from here
        fprintf(stderr, "Loading character %c failed!\n", i);
        continue;
    }
    glTexSubImage2D(GL_TEXTURE_2D, 0, ox, oy, g->bitmap.width, g->bitmap.rows,
                    GL_ALPHA, GL_UNSIGNED_BYTE, g->bitmap.buffer);
    float ax = g->advance.x >> 6;
    float ay = g->advance.y >> 6; // was: float ay = ay = g->advance.y >> 6;
    float bw = g->bitmap.width;
    float bh = g->bitmap.rows;
    float bl = g->bitmap_left;
    float bt = g->bitmap_top;
    m_GlyphMap[i] = Glyph(ax, ay, bw, bh, bl, bt, ox, oy);
    ox += g->bitmap.width + 1;
    // there should be some sort of deallocation...
}
So the main question: is there some function to deallocate the GlyphSlot that I'm missing, or is this a bug in FreeType?
Answer 0 (score: 4)
Make sure you call FT_Done_FreeType(lib_); when your program shuts down or when you stop using FreeType. If you already do that, make sure you are using the latest FreeType version. I have an almost identical loop and it works fine on Windows 8 x64. Here is my code:
for (UINT32 i = 0; i < text.length(); i++) {
    err_ = FT_Load_Char(face_, text[i], FT_LOAD_RENDER);
    if (err_) {
        LOGW("Unable to select, load and render character."
             " Error code: %d", err_);
        continue;
    }
    FT_Bitmap bitmap = glyphSlot->bitmap;
    FT_UInt glyphIndex = FT_Get_Char_Index(face_, text[i]);
    err_ = FT_Get_Kerning(face_, previous, glyphIndex,
                          FT_KERNING_DEFAULT, &delta);
    if (err_) {
        LOGW("Unable to get kerning for character."
             " Error code: %d", err_);
        continue;
    }
    Glyph tmp;
    tmp.kerningOffset = delta.x >> 6;
    tmp.buffer = new UINT8[bitmap.rows * bitmap.width];
    memcpy(tmp.buffer, bitmap.buffer, bitmap.rows * bitmap.width);
    tmp.height = bitmap.rows;
    tmp.width = bitmap.width;
    tmp.offsetLeft = glyphSlot->bitmap_left;
    if (tmp.offsetLeft < 0) {
        tmp.offsetLeft = 0;
    }
    tmp.offsetTop = glyphSlot->bitmap_top;
    tmp.advanceX = glyphSlot->advance.x >> 6;
    tmp.advanceY = glyphSlot->advance.y >> 6;
    glyphs.push_back(tmp);
    previous = glyphIndex;
    width += tmp.advanceX + tmp.kerningOffset;
}
Also, if you allocate the glyph buffers separately like this, don't forget to delete them:
for (SIZE i = 0; i < glyphs.size(); i++) {
    Glyph g = glyphs[i];
    delete [] g.buffer;
}