Specifying the Texture in OpenGL

After drawing the texture, call the OpenGL function glTexImage2D() to specify the texture image. (There are also versions of this function for one- and three-dimensional textures.) This function takes nine arguments:

  • The target texture, which will normally be GL_TEXTURE_2D.
  • The level of detail number. Pass 0 to use the base image. Numbers greater than 0 tell OpenGL to use a mipmap, which is a reduced-size version of the texture; higher values select smaller mipmaps.
  • The internal format, which specifies the number of color components in the texture.
  • The width of the texture.
  • The height of the texture. For the highest level of compatibility the width and height should be a power of 2.
  • The width of the texture’s border, which will be either 0 or 1. If you have a border width of 1, you must add 2 to the texture’s width and height.
  • The format of the pixel data, which specifies the order of the color components, such as GL_RGBA or GL_BGRA.
  • The data type of the pixel data.
  • The image data.

For the sample code I included in this article, I used the internal format GL_RGBA8. This format has 8 bits each of red, green, blue, and alpha. GL_RGBA8 works well on Mac OS X if you’re using 32-bit color.

The most common Mac offscreen buffer pixel format is k32ARGBPixelFormat, which has 8 bits each of alpha, red, green, and blue. OpenGL has no GL_ARGB format, so if you use k32ARGBPixelFormat, your format must be GL_BGRA, which is the reverse of ARGB. On a PowerPC Mac you must also use the data type GL_UNSIGNED_INT_8_8_8_8_REV, which reverses the byte order. Intel Macs don’t need the byte swap, so they use the data type GL_UNSIGNED_INT_8_8_8_8.

/* Choose the pixel data type that matches the CPU's byte order. */
#if __BIG_ENDIAN__
	textureMap.SetType(GL_UNSIGNED_INT_8_8_8_8_REV);
#else
	textureMap.SetType(GL_UNSIGNED_INT_8_8_8_8);
#endif

The following code demonstrates the call to glTexImage2D() on Mac OS X:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
    GL_BGRA, type, imageData);
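
Putting the pieces together, a typical upload sequence looks roughly like the following sketch. It must run with a current OpenGL context, and the texture-name handling and filter settings are illustrative assumptions on my part, not part of the article’s sample code; width, height, and imageData stand in for the offscreen buffer described earlier.

```c
GLuint textureName;                      /* hypothetical texture name variable */
glGenTextures(1, &textureName);
glBindTexture(GL_TEXTURE_2D, textureName);

/* Since we pass level 0 only and supply no mipmaps, the minification
   filter must not be a mipmap filter or the texture is incomplete. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

/* Pick the data type that matches the CPU's byte order, as above. */
#if __BIG_ENDIAN__
GLenum type = GL_UNSIGNED_INT_8_8_8_8_REV;   /* PowerPC */
#else
GLenum type = GL_UNSIGNED_INT_8_8_8_8;       /* Intel */
#endif

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
    GL_BGRA, type, imageData);
```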
