OpenGL Programming/Post-Processing

Post-processing effects are applied after the main OpenGL scene has been rendered.

Technical overview
To apply a global effect on the whole scene, we face a limitation: all the shaders work locally: vertex shaders only know about the current vertex, and fragment shaders only know about the current pixel.

The only exception is when working with textures: in this case, we can access any part of the texture using texture coordinates.

So the idea for post-processing is to first render the whole scene in a texture, and then render this single texture to screen with the post-processing.

Two main alternatives exist:
 * render the scene to the screen a first time, then copy the screen to a texture (for instance with glCopyTexSubImage2D())
 * render directly to a texture through a framebuffer object

We'll use the second method, which should be more efficient, and can render on an area bigger than the physical screen if necessary.

(The first method may be necessary if you plan to use the stencil buffer as well.)

Framebuffer
We will create:
 * a framebuffer object
 * with a depth buffer stored in a render buffer (necessary to render a 3D scene)
 * a color buffer stored in a texture (with GL_CLAMP_TO_EDGE to avoid default GL_REPEAT's border "warping" effect).
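A sketch of that setup, assuming globals named fbo, fbo_texture and rbo_depth, and screen_width/screen_height holding the render size:

```c
GLuint fbo, fbo_texture, rbo_depth;

/* Color buffer: an empty texture, clamped to avoid GL_REPEAT's border warping */
glActiveTexture(GL_TEXTURE0);
glGenTextures(1, &fbo_texture);
glBindTexture(GL_TEXTURE_2D, fbo_texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, screen_width, screen_height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glBindTexture(GL_TEXTURE_2D, 0);

/* Depth buffer, stored in a render buffer */
glGenRenderbuffers(1, &rbo_depth);
glBindRenderbuffer(GL_RENDERBUFFER, rbo_depth);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, screen_width, screen_height);
glBindRenderbuffer(GL_RENDERBUFFER, 0);

/* Framebuffer object tying both together */
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, fbo_texture, 0);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, rbo_depth);
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
  fprintf(stderr, "glCheckFramebufferStatus: framebuffer is not complete\n");
glBindFramebuffer(GL_FRAMEBUFFER, 0);
```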

Vertices
Then we'll need a basic set of vertices to display the resulting texture on screen. In this example we'll only use 2D coordinates because we plan to make a 2D effect, but feel free to use 3D coordinates for a 3D effect (mapping the texture on a rotating cube like Compiz, for instance):
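For instance, a quad covering the whole screen in normalized device coordinates, drawn as a triangle strip (the name vbo_fbo_vertices is an assumption):

```c
GLuint vbo_fbo_vertices;
GLfloat fbo_vertices[] = {
  -1, -1,
   1, -1,
  -1,  1,
   1,  1,
};
glGenBuffers(1, &vbo_fbo_vertices);
glBindBuffer(GL_ARRAY_BUFFER, vbo_fbo_vertices);
glBufferData(GL_ARRAY_BUFFER, sizeof(fbo_vertices), fbo_vertices, GL_STATIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, 0);
```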

Program
Now we'll need a separate program for our post-processing effect. It's a lot of code, but it's a mere copy/paste from the basic tutorials :)
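A sketch of the interesting part, assuming a create_program() helper (compile and link two GLSL files, as in the earlier tutorials) and the variable names below:

```c
GLuint program_postproc;
GLint attribute_v_coord_postproc, uniform_fbo_texture;

program_postproc = create_program("postproc.v.glsl", "postproc.f.glsl");
attribute_v_coord_postproc = glGetAttribLocation(program_postproc, "v_coord");
uniform_fbo_texture = glGetUniformLocation(program_postproc, "fbo_texture");
if (attribute_v_coord_postproc == -1 || uniform_fbo_texture == -1)
  fprintf(stderr, "Could not bind post-processing attribute/uniform\n");
```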

Drawing
We've got all our pre-requisites, so now how do we draw to the texture?

In our display callback, let's add:
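A sketch of the change, assuming the framebuffer object from the setup above is named fbo:

```c
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
/* ... render the scene as usual ... */
glBindFramebuffer(GL_FRAMEBUFFER, 0);
```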

We've changed the destination framebuffer to our own framebuffer, drawn the scene (to its texture), and then switched back to the physical screen's framebuffer.

Now we can display the texture on screen, using our new program:
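Roughly (reusing the assumed names program_postproc, fbo_texture, uniform_fbo_texture, attribute_v_coord_postproc and vbo_fbo_vertices from above):

```c
glClearColor(0.0, 0.0, 0.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT);

glUseProgram(program_postproc);
glBindTexture(GL_TEXTURE_2D, fbo_texture);
glUniform1i(uniform_fbo_texture, /*GL_TEXTURE*/0);

glEnableVertexAttribArray(attribute_v_coord_postproc);
glBindBuffer(GL_ARRAY_BUFFER, vbo_fbo_vertices);
glVertexAttribPointer(attribute_v_coord_postproc, 2, GL_FLOAT, GL_FALSE, 0, 0);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
glDisableVertexAttribArray(attribute_v_coord_postproc);
```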

Gotchas
When using multiple programs, make sure you set the rendering state to the correct program with glUseProgram() before setting your uniforms. In particular, the display routine switches to our post-processing program with glUseProgram(), so your scene-rendering code needs to select its own program again before setting its uniforms, or you'll get a blank screen due to a missing MVP matrix (and OpenGL won't tell you).

In case the resolution of your texture and your screen differ, adjust the viewport size accordingly, using glViewport().

Shaders
First, let's implement an identity (no-change) shader; we'll modify it later to create a first effect.

We chose not to pre-compute the texture coordinates, so the vertex shader derives them from the vertex positions; nothing fancy.
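A minimal sketch of that vertex shader (the names v_coord and f_texcoord are assumptions), mapping the quad's [-1,1] clip coordinates to [0,1] texture coordinates:

```glsl
attribute vec2 v_coord;
varying vec2 f_texcoord;

void main(void) {
  gl_Position = vec4(v_coord, 0.0, 1.0);
  /* map [-1,1] clip coordinates to [0,1] texture coordinates */
  f_texcoord = (v_coord + 1.0) / 2.0;
}
```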

Now the fragment shader will be able to pick pixels anywhere we want in the texture - we're not restricted to the current pixel anymore!
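For the identity effect, the fragment shader just samples the scene texture at the current coordinates (fbo_texture and f_texcoord are the assumed names from above):

```glsl
uniform sampler2D fbo_texture;
varying vec2 f_texcoord;

void main(void) {
  gl_FragColor = texture2D(fbo_texture, f_texcoord);
}
```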

A first effect
Let's implement a very basic post-processing effect: a static wave on the screen, using the sin function. There is a similar (but more complex) effect in God of War III during the Poseidon Hippocamp's water breathing attack.

The idea is to offset the x coordinate periodically, with the offset varying progressively along the y axis:
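A sketch of the modified fragment shader, assuming an offset uniform for the animation phase:

```glsl
uniform sampler2D fbo_texture;
uniform float offset;
varying vec2 f_texcoord;

void main(void) {
  vec2 texcoord = f_texcoord;
  /* 4 full sin periods over the screen height, amplitude 1/100 of the width */
  texcoord.x += sin(texcoord.y * 4.0 * 2.0 * 3.14159 + offset) / 100.0;
  gl_FragColor = texture2D(fbo_texture, texcoord);
}
```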

We have 4 vertical sin waves, and their amplitude is 1/100 of the screen width.

The offset uniform is used to animate the effect, by changing the starting point (phase) of the sin function; we update it regularly:
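One way to drive it, sketched as a GLUT idle callback (uniform_offset being the assumed location of the offset uniform):

```c
void onIdle(void) {
  /* advance 3/4 of a wave cycle per second */
  float move = glutGet(GLUT_ELAPSED_TIME) / 1000.0 * 2.0 * 3.14159 * 0.75;
  glUseProgram(program_postproc);
  glUniform1f(uniform_offset, move);
  glutPostRedisplay();
}
```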

We've done our first post-processing effect!

Links

 * SFML (a 2D game library) provides a post-effect system implementing this technique