OpenGL Programming/Video Capture

apitrace is a nice debugging tool.

It has a save/replay feature that records everything you send to the graphics card, can reproduce it later, and can even save the result as a stream of pictures.

We'll use this to create a video suitable for upload to Wikimedia Commons!

= Compilation (instructions for GNU/Linux) =
Get the source code:
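For example, with git (the URL is apitrace's current upstream repository; adjust if it has moved):

```shell
git clone https://github.com/apitrace/apitrace.git
```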

Install cmake and the dependencies:
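On Debian/Ubuntu, something along these lines should work (the exact package list is an assumption and may vary with your distribution and apitrace version):

```shell
sudo apt-get install cmake build-essential libx11-dev libpng-dev zlib1g-dev
```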

Compile!
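A typical out-of-tree CMake build (the build directory name is our choice):

```shell
cd apitrace
mkdir build && cd build
cmake ..
make
```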

Note: for a 32-bit build on a 64-bit host:
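One way is to pass -m32 to the compilers; this sketch assumes a 32-bit toolchain and 32-bit development libraries are already installed:

```shell
cmake .. -DCMAKE_C_FLAGS=-m32 -DCMAKE_CXX_FLAGS=-m32
make
```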

= Capture =
Let's test with the simple wave post-processing effect:
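Assuming the tutorial binary is called wave (the name is our assumption), we run it under apitrace:

```shell
apitrace trace ./wave
```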

Run the program for a few seconds, rotate the object, zoom, etc., and then quit the program.

This will create a .trace binary file, named after the program (e.g. wave.trace). If the file already exists, a numbered variant (e.g. wave.1.trace) is created instead, and so on.

= Replay =
You use the glretrace command, passing it the trace file as a parameter:
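For instance (the trace filename is our assumption from the capture step):

```shell
glretrace wave.trace
```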

Of course, this is not interactive anymore; it is just a replay.

= Convert to video =
Let's install ffmpeg:
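On Debian/Ubuntu:

```shell
sudo apt-get install ffmpeg
```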

Note: we tried to use ffmpeg's scaling filter to reduce the screen size, but sadly it caused a segfault no matter what we tried. So instead we just recompiled the application with a smaller screen size specified directly.

ffmpeg's options are organized the following way: ffmpeg [input options] -i source [output options] destination - each option applies to the file whose name follows it.


 * We'll use an Ogg Theora (.ogv) output since that's the only format accepted by Wikimedia Commons (if you know who could get support for the free and better .webm format, please contact him/her!).
 * The screen used during the tutorial has a refresh rate of 75 Hz, but videos are usually 25 images/s, so we'll reduce the rate (using the -r parameter twice: once for the input, once for the output).
 * We'll use a fixed, good quality.
 * We'll overwrite the destination file.

We get:
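A possible invocation, piping glretrace's snapshots straight into ffmpeg (the trace filename and the quality value 7 are our choices; -y overwrites the destination, the first -r is the input rate, the second the output rate):

```shell
glretrace -s - wave.trace \
  | ffmpeg -y -r 75 -f image2pipe -vcodec ppm -i pipe: -r 25 -q:v 7 video.ogv
```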

We've got our video - and no additional code was needed.

Here's the result!

= WebGL variant =

When running a WebGL application, capturing the browser didn't work well for us: performance was poor, and it captured the whole Firefox window - not just the animation.

So we implemented an internal capture system:

== Time control ==
One advantage of doing the capture manually is that you can slow down the flow of time as much as you want, and hence avoid any performance issues during the capture. We did so by adding a little wrapper around the time function (in our case, three.js's):
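A minimal sketch of such a wrapper (the names are our own; the real code depends on how your animation reads time):

```javascript
// Virtual clock: each rendered frame advances time by exactly 1/30 s,
// no matter how long the frame actually took to render and export.
var FPS = 30;
var frame = 0;

// Used in place of the real time source (e.g. three.js' clock):
function getVirtualTime() {
  return frame / FPS;  // seconds of "animation time" elapsed
}

// Called once per rendered-and-captured frame:
function nextFrame() {
  frame++;
}
```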

We decided to use a 30 FPS frame rate, which is common in videos (as of 2013).

== Copy the WebGL frame ==
Use your WebGL canvas' toDataURL() method:

This returns a base64-encoded image in the "data:image/png;base64,..." form.
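A sketch of the capture and of a helper to strip the data-URL prefix (note: the WebGL context generally needs to be created with preserveDrawingBuffer: true, otherwise the drawing buffer may already be cleared when toDataURL() runs):

```javascript
// Grab the current WebGL frame as a PNG data URL:
function captureFrame(canvas) {
  return canvas.toDataURL("image/png");
}

// Keep only the base64 payload, dropping "data:image/png;base64,":
function stripDataURLPrefix(dataURL) {
  return dataURL.slice(dataURL.indexOf(",") + 1);
}
```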

== Export using AJAX ==
JavaScript cannot save local files directly, so we'll export these images to a webserver using Ajax:
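A sketch of the upload; the URL and frame-numbering scheme are our assumptions. The request is synchronous (third argument false), so the next frame is only rendered once this one is stored:

```javascript
// Zero-pad frame numbers so the files sort correctly (0001, 0002, ...):
function pad(n, width) {
  var s = String(n);
  while (s.length < width) s = "0" + s;
  return s;
}

// POST one captured frame (a PNG data URL) to the local storage server:
function exportFrame(frameNumber, dataURL) {
  var xhr = new XMLHttpRequest();
  xhr.open("POST", "http://localhost:8080/upload?frame=" + pad(frameNumber, 4), false);
  xhr.send(dataURL);
}
```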

== Minimal web server ==
We can write a minimal webserver that will just store the images we exported, using NodeJS:
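A possible implementation; the port, URL scheme, and file naming are assumptions that must match the client side:

```javascript
var http = require("http");
var fs = require("fs");
var url = require("url");

http.createServer(function (req, res) {
  var body = "";
  req.on("data", function (chunk) { body += chunk; });
  req.on("end", function () {
    // The frame number comes from the query string, e.g. /upload?frame=0001
    var frame = url.parse(req.url, true).query.frame;
    // The body is a PNG data URL; keep only the base64 payload and decode it:
    var base64 = body.slice(body.indexOf(",") + 1);
    fs.writeFileSync(frame + ".png", Buffer.from(base64, "base64"));
    res.end("OK");
  });
}).listen(8080);
```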

You can run it with:
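Assuming the server script was saved as server.js (the filename is our choice):

```shell
node server.js
```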

== Assemble the video ==
This step is similar to the one with glretrace above. We'll just use a special syntax to grab all the PNG files:
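For instance, using the %04d pattern to match the zero-padded PNG filenames (the 30 FPS input rate matches our capture rate; the quality value is our choice):

```shell
avconv -y -r 30 -i %04d.png -q:v 7 video.ogv
```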

(By the way, avconv is a fork of / alternative to ffmpeg, with nearly identical command-line options.)

We're done: we've got a perfect-sync capture of our WebGL animation!

= Going further =

Contributions on how to capture synchronized application audio streams would be welcome :)
