Why isn't GLsizei defined as unsigned?

Posted: 2012-01-25 01:18:02

Tags: opengl types opengl-es

I looked up the typedef for GLsizei in the OpenGL ES 1.1 implementation on iOS and was surprised to find it defined as int. Some quick googling showed that this is normal (for regular OpenGL as well).

I would have expected it to be defined as unsigned int or size_t. Why is it defined as a plain int?

1 Answer:

Answer 0 (score: 5)

It seems unlikely to be a problem unless you have data structures approaching 4GB in size.

Here is an answer someone gave on the OpenGL mailing list: http://oss.sgi.com/archives/ogl-sample/2005-07/msg00003.html

Quote:

(1) Arithmetic on unsigned values in C doesn't always yield intuitively
correct results (e.g. width1-width2 is positive when width1<width2).
Compilers offer varying degrees of diagnosis when unsigned ints appear
to be misused.  Making sizei a signed type eliminates many sources of
semantic error and some irrelevant diagnostics from the compilers.  (At
the cost of reducing the range of sizei, of course, but for the places
sizei is used that's rarely a problem.)

(2) Some languages that support OpenGL bindings lack (lacked? not sure
about present versions of Fortran) unsigned types, so by sticking to
signed types as much as possible there would be fewer problems using
OpenGL in those languages.
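
To make point (1) concrete, here is a minimal C sketch (the width1/width2 names come from the quote; the concrete values are just for illustration) showing how an unsigned difference wraps around to a huge positive number instead of going negative, while a signed type like GLsizei gives the intuitive result:

    #include <stdio.h>

    int main(void) {
        /* Hypothetical widths, chosen so that width1 < width2. */
        unsigned int width1 = 100u;
        unsigned int width2 = 300u;

        /* Unsigned arithmetic wraps modulo 2^32, so the "difference"
           comes out as a huge positive value instead of -200. */
        unsigned int udiff = width1 - width2;   /* 4294967096 with 32-bit unsigned int */

        /* With signed arithmetic (GLsizei is an int) the result is the
           intuitively correct -200. */
        int sdiff = (int)width1 - (int)width2;

        printf("unsigned: %u\n", udiff);
        printf("signed:   %d\n", sdiff);
        return 0;
    }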

Both explanations seem plausible. I have run into this myself more than once when foolishly using NSUInteger as a loop counter (hint: don't do that, especially when counting backwards down to zero).
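
As a minimal C sketch of that backwards-counting trap (using size_t here in place of NSUInteger, since both are unsigned integer types):

    #include <stddef.h>
    #include <stdio.h>

    int main(void) {
        /* The buggy pattern: `i >= 0` is always true for an unsigned type,
           so when i is decremented past 0 it wraps to SIZE_MAX and the loop
           never terminates. Many compilers warn that the comparison is
           always true.

        for (size_t i = 9; i >= 0; --i) {
            printf("%zu\n", i);
        }
        */

        /* One safe variant: test before decrementing. */
        for (size_t i = 10; i-- > 0; ) {
            printf("%zu\n", i);   /* prints 9 down to 0, then stops */
        }
        return 0;
    }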