GLSL Programming/Blender/Textured Spheres

This tutorial introduces texture mapping.

It's the first in a series of tutorials about texturing in GLSL shaders in Blender. In this tutorial, we start with a single texture map on a sphere. More specifically, we map an image of the Earth's surface onto a sphere. Based on this, further tutorials cover topics such as lighting of textured surfaces, transparent textures, multitexturing, gloss mapping, etc.



Texture Mapping
The basic idea of “texture mapping” (or “texturing”) is to map an image (i.e. a “texture” or a “texture map”) onto a triangle mesh; in other words, to put a flat image onto the surface of a three-dimensional shape.

To this end, “texture coordinates” are defined, which simply specify a position in the texture (i.e. image). The horizontal coordinate is officially called s and the vertical coordinate t. However, it is very common to refer to them as x and y. In animation and modeling tools, texture coordinates are usually called u and v.

In order to map the texture image to a mesh, every vertex of the mesh is given a pair of texture coordinates. (This process (and the result) is sometimes called “UV mapping” since each vertex is mapped to a point in the UV-space.) Thus, every vertex is mapped to a point in the texture image. The texture coordinates of the vertices can then be interpolated for each point of any triangle between three vertices and thus every point of all triangles of the mesh can have a pair of (interpolated) texture coordinates. These texture coordinates map each point of the mesh to a specific position in the texture map and therefore to the color at this position. Thus, rendering a texture-mapped mesh consists of two steps for all visible points: interpolation of texture coordinates and a look-up of the color of the texture image at the position specified by the interpolated texture coordinates.
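These two steps, interpolation of texture coordinates followed by a color look-up, can be sketched in plain Python (a simplified model with a toy 2×2 image and nearest-neighbor look-up; this is not Blender or OpenGL API, just an illustration of the idea):

```python
def interpolate_uv(uv0, uv1, uv2, w0, w1, w2):
    """Barycentrically interpolate per-vertex texture coordinates.
    w0 + w1 + w2 == 1 are the barycentric weights of a point in the triangle."""
    u = w0 * uv0[0] + w1 * uv1[0] + w2 * uv2[0]
    v = w0 * uv0[1] + w1 * uv1[1] + w2 * uv2[1]
    return (u, v)

def lookup(texture, uv):
    """Nearest-neighbor look-up: (0,0) is one corner of the image,
    (1,1) the diagonally opposite one."""
    height = len(texture)
    width = len(texture[0])
    x = min(int(uv[0] * width), width - 1)
    y = min(int(uv[1] * height), height - 1)
    return texture[y][x]

# A toy 2x2 "image": each entry is an (R, G, B) color.
texture = [[(255, 0, 0), (0, 255, 0)],
           [(0, 0, 255), (255, 255, 255)]]

# Texture coordinates at the three triangle vertices; equal weights
# pick the centroid of the triangle, which maps to (1/3, 1/3).
uv = interpolate_uv((0.0, 0.0), (1.0, 0.0), (0.0, 1.0), 1/3, 1/3, 1/3)
color = lookup(texture, uv)
```

On the GPU, both steps happen in fixed-function hardware (the rasterizer and the texture units); the fragment shader only has to ask for the look-up.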

In OpenGL, any valid floating-point number is a valid texture coordinate. However, when the GPU is asked to look up a pixel (or “texel”) of a texture image (e.g. with the “texture2D” instruction described below), it will internally map the texture coordinates to the range between 0 and 1 in a way depending on the “wrap mode”. For example, wrap mode “repeat” basically uses the fractional part of the texture coordinates to determine texture coordinates in the range between 0 and 1. On the other hand, wrap mode “clamp” clamps the texture coordinates to this range. These internal texture coordinates in the range between 0 and 1 are then used to determine the position in the texture image: $$(0,0)$$ specifies the lower, left corner of the texture image; $$(1,0)$$ the lower, right corner; $$(0,1)$$ the upper, left corner; etc. OpenGL's wrap mode corresponds to Blender's settings under Properties > Texture tab > Image Mapping. Unfortunately, Blender doesn't appear to set the OpenGL wrap mode; it always appears to be set to “repeat”.
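The two wrap modes can be modeled in a few lines of Python (a sketch of the idea, not the exact OpenGL sampling rules):

```python
import math

def wrap_repeat(coord):
    """Wrap mode "repeat": keep only the fractional part, so 1.25 and
    0.25 address the same position in the texture image."""
    return coord - math.floor(coord)

def wrap_clamp(coord):
    """Wrap mode "clamp": force the coordinate into the range [0, 1]."""
    return min(max(coord, 0.0), 1.0)

print(wrap_repeat(1.25))   # 0.25
print(wrap_repeat(-0.25))  # 0.75 (repeat wraps around for negative values)
print(wrap_clamp(1.25))    # 1.0
print(wrap_clamp(-0.25))   # 0.0
```

Note that with “repeat”, texture coordinates just beyond 1 wrap back to the opposite edge of the image, which is what makes seamless tiling possible.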

Texturing a Sphere in Blender
To map the image of the Earth's surface to the left onto a sphere in Blender, you first have to download this image to your computer: click the image to the left until you get to a larger version and save it (usually with a right-click) to your computer (remember where you saved it). Then switch to Blender and:
 * add a sphere (in an Info window choose Add > Mesh > UV Sphere),
 * select it in the 3D View (by right-clicking),
 * activate smooth shading (in the Tool Shelf of the 3D View; press t if the Tool Shelf is not visible),
 * make sure that Display > Shading: GLSL is set in the Properties of the 3D View (press n if they aren't displayed), and
 * switch the Viewport Shading of the 3D View to Textured (the second icon to the right of the main menu in the 3D View).
Now (with the sphere still being selected) add a material (in a Properties window > Material tab > New). Then add a new texture (in the Properties window > Textures tab > New), select Image or Movie for the Type, and click Image > Open. Select your file in the file browser and click on Open Image (or double-click it in the file browser). The image should now appear in the preview section of the Textures tab and Blender should put it onto the sphere in the 3D View.

Now you should make sure that the Coordinates in the Properties window > Textures tab > Mapping are set to Generated. This means that our texture coordinates will be set to the coordinates in object space. Specifying or generating texture coordinates (i.e. UVs) in any modeling tool is a whole different topic which is well beyond the scope of this tutorial.

With these settings, Blender will also send texture coordinates to the vertex shader. (Actually, we could also use the object coordinates in gl_Vertex because they are the same in this case.) Thus, we can write a vertex shader that receives the texture coordinates and hands them through to the fragment shader. The fragment shader then does some computation on the four-dimensional texture coordinates to compute the longitude and latitude (scaled to the range from 0 to 1), which are used as texture coordinates here. Usually this step would be unnecessary since the texture coordinates should already correctly specify where to look up the texture image. (In fact, any such processing of texture coordinates in the fragment shader should be avoided for performance reasons; here I'm only using this trick to avoid setting up appropriate UV texture coordinates.) The Python script to set up the shader could be:
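For example, something along these lines (a sketch based on the Blender Game Engine's bge Python API; the names textureUnit, texCoords, and longitudeLatitude are my choice, not mandated by Blender):

```python
# GLSL vertex shader: hand the texture coordinates through to the fragment shader.
VertexShader = """
varying vec4 texCoords;  // texture coordinates at this vertex

void main()
{
    texCoords = gl_MultiTexCoord0;  // Blender's generated texture coordinates
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
"""

# GLSL fragment shader: compute longitude and latitude in the range [0, 1]
# from the interpolated coordinates and look up the texture there.
FragmentShader = """
varying vec4 texCoords;         // interpolated texture coordinates
uniform sampler2D textureUnit;  // texture unit, set from Python below

void main()
{
    vec2 longitudeLatitude = vec2(
        (atan(texCoords.y, texCoords.x) / 3.1415926 + 1.0) * 0.5,
        (asin(texCoords.z) / 3.1415926 + 0.5));
    gl_FragColor = texture2D(textureUnit, longitudeLatitude);
}
"""

# Attach the shaders to all materials of the object this script is attached to.
# This part only works inside the Blender Game Engine (the bge module).
try:
    import bge
except ImportError:
    bge = None  # not running inside the game engine

if bge is not None:
    cont = bge.logic.getCurrentController()
    obj = cont.owner
    for mesh in obj.meshes:
        for material in mesh.materials:
            shader = material.getShader()
            if shader is not None:
                if not shader.isValid():
                    shader.setSource(VertexShader, FragmentShader, True)
                shader.setSampler('textureUnit', 0)  # first texture slot
```

The script should be attached to the sphere with an Always sensor and a Python controller in the Logic Editor, as in the earlier tutorials of this series.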

Note the call to setSampler in the last line of the Python script: it sets the sampler2D uniform variable of the fragment shader to 0. This specifies that the texture should be used which is first in the list in the Properties window > Textures tab. A value of 1 would select the second in the list, etc. In fact, for each sampler2D variable that you use in a fragment shader, you have to set its value with a call to setSampler in the Python script as shown above. Actually, a sampler2D uniform specifies a texture unit of the GPU. (A texture unit is a part of the hardware that is responsible for the lookup and interpolation of colors in texture images.) The number of texture units of a GPU is available in the built-in GLSL constant gl_MaxTextureImageUnits, which is usually 4 or 8. Thus, the number of different texture images available in a fragment shader is limited to this number.

If everything went right, the texture image should now appear correctly mapped onto the sphere when you start the game engine by pressing p. (Otherwise Blender maps it differently onto the sphere.) Congratulations!

How It Works
Since many techniques use texture mapping, it pays off very well to understand what is happening here. Therefore, let's review the shader code:
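For reference, the GLSL part of the Python script could look like this (a sketch; the names texCoords, textureUnit, and longitudeLatitude are assumptions, and the vertex and fragment shader are shown one after the other):

```glsl
// vertex shader: hand the texture coordinates through to the fragment shader
varying vec4 texCoords;

void main()
{
    texCoords = gl_MultiTexCoord0;
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// fragment shader: compute longitude/latitude in [0, 1] and look up the texture
varying vec4 texCoords;
uniform sampler2D textureUnit;

void main()
{
    vec2 longitudeLatitude = vec2(
        (atan(texCoords.y, texCoords.x) / 3.1415926 + 1.0) * 0.5,
        (asin(texCoords.z) / 3.1415926 + 0.5));
    gl_FragColor = texture2D(textureUnit, longitudeLatitude);
}
```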

The vertices of Blender's sphere object come with attribute data in gl_MultiTexCoord0 for each vertex, which specifies texture coordinates; in our particular example these are the same values as in the attribute gl_Vertex, which specifies the position in object space.

The vertex shader then writes the texture coordinates of each vertex to a varying variable. For each fragment of a triangle (i.e. each covered pixel), the values of this varying at the three triangle vertices are interpolated (see the description in “Rasterization”) and the interpolated texture coordinates are handed to the fragment shader. In this particular example, the fragment shader computes new texture coordinates (the scaled longitude and latitude) from them. Usually, this wouldn't be necessary because correct texture coordinates should be specified within Blender using UV mapping. The fragment shader then uses these texture coordinates to look up a color in the texture image specified by the sampler2D uniform at the interpolated position in texture space, and returns this color in gl_FragColor, which is then written to the framebuffer and displayed on the screen.

It is crucial that you gain a good idea of these steps in order to understand the more complicated texture mapping techniques presented in other tutorials.

Summary
You have reached the end of one of the most important tutorials. We have looked at:
 * How to set up a Blender object for texturing.
 * How to import a texture image.
 * How a vertex shader and a fragment shader work together to map a texture image onto a mesh.