OpenGL Programming/Android GLUT Wrapper

Our wrapper: Making-of

If you plan to write your own OpenGL ES 2.0 application, here are some tips on how the wrapper does it:

Writing C/C++ code for Android
Android's applications are written in Java, but they can call C/C++ code using JNI (Java Native Interface), which in Android is presented as the NDK (Native Development Kit).

You can either:
 * Write both a Java wrapper and C++ code:
   * available since Android 1.5
   * the C++ code may interact with an OpenGL ES context created by Java
   * creating an OpenGL ES 2.0 context (with EGL) directly from C++ requires Android 2.3/Gingerbread/API android-9
   * OpenGL ES 2.0 is available since Android 2.0/API android-5
   * example: NDK's hello-gl2 sample
 * Or rely on the built-in "NativeActivity" Java wrapper, and only write C++ code:
   * available since Android 2.3/Gingerbread/API android-9
   * use EGL to create the OpenGL ES context
   * example: NDK's native-activity sample (it's OpenGL ES 1.x, but can easily be upgraded)

Native Activity details
Android 2.3/Gingerbread/API android-9 introduces native activities, which allow writing an application without any Java.

Note that the native-activity sample may declare a lower default API level; it should be android-9, since native activities require it.

Also, make sure your manifest declares your native library; otherwise the application won't start.
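A minimal declaration might look like the following sketch; the library name "tut" is a placeholder for your own module name (NativeActivity loads lib<value>.so based on the android.app.lib_name meta-data):

```xml
<application android:label="@string/app_name" android:hasCode="false">
  <activity android:name="android.app.NativeActivity"
            android:label="@string/app_name">
    <!-- Name of the .so to load, without the "lib" prefix and ".so" suffix -->
    <meta-data android:name="android.app.lib_name" android:value="tut" />
    <intent-filter>
      <action android:name="android.intent.action.MAIN" />
      <category android:name="android.intent.category.LAUNCHER" />
    </intent-filter>
  </activity>
</application>
```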

Your entry point is the android_main function (instead of the more common main or WinMain). For portability, you could rename it at the pre-processor level using a #define.
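A minimal skeleton might look like this (a sketch assuming the android_native_app_glue headers; initialization and the event loop are elided):

```cpp
#include <android_native_app_glue.h>

// Called by the glue code in its own thread, instead of main()
void android_main(struct android_app* app) {
    // Make sure the glue code isn't stripped by the linker
    app_dummy();

    // ... initialize EGL, then run the event/render loop ...
}
```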

Build system
The wrapper is based on the native-activity sample. It uses the 'android_native_app_glue' code that deals with non-blocking processing of Android events.

Since we don't call the glue code directly (its entry points are callbacks used by Android, not us), it may be stripped by the linker, so let's call its dummy entry point, app_dummy(), from our own code.

It uses OpenGL ES 2.0 (rather than the sample's OpenGL ES 1.x), so we link against libGLESv2.

To use GLM, we need to enable the C++ STL and advertise GLM's install location.

We can now declare our source files (tut.cpp).
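Putting these build settings together, the build files might look like this sketch; the module name "tut" and the GLM_INSTALL_DIR variable are placeholders, and the choice of STL is an assumption:

```make
# jni/Android.mk
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE    := tut
LOCAL_SRC_FILES := tut.cpp                          # our source files
LOCAL_C_INCLUDES := $(GLM_INSTALL_DIR)              # GLM is header-only
LOCAL_LDLIBS    := -llog -landroid -lEGL -lGLESv2   # ES 2.0, not 1.x
LOCAL_STATIC_LIBRARIES := android_native_app_glue
include $(BUILD_SHARED_LIBRARY)

$(call import-module,android/native_app_glue)

# jni/Application.mk (a separate file): enable a C++ STL for GLM
#   APP_STL      := gnustl_static
#   APP_PLATFORM := android-9
```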

To run the build system:
 * Compile the C/C++ code
 * Prepare the Java build system (only once):
 * Create the .apk package:
 * Install it:
 * Clean:
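With the Ant-based build system of that era, the commands might look like this (a sketch; the project name and .apk path are placeholders):

```shell
ndk-build                                    # compile the C/C++ code
android update project --name tut --path .   # prepare the Java build system (once)
ant debug                                    # create the .apk package (debug-signed)
adb install -r bin/tut-debug.apk             # install it on the connected device
ant clean                                    # clean
```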

We included these commands in the wrapper.

Creating the OpenGL ES context with EGL
We need to tell EGL to create an OpenGL ES context with version 2.0 (not 1.x).

This is done in two places: firstly when requesting the available configurations, and secondly when creating the context.
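Concretely, we request an ES2-capable configuration via EGL_RENDERABLE_TYPE, and pass EGL_CONTEXT_CLIENT_VERSION when creating the context (a sketch; error checking is elided):

```cpp
// Firstly: only ask for EGLConfigs that can back an ES 2.0 context
const EGLint config_attribs[] = {
    EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
    EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
    EGL_BLUE_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_RED_SIZE, 8,
    EGL_NONE
};
eglChooseConfig(display, config_attribs, &config, 1, &num_configs);

// Secondly: request a version 2.0 context
const EGLint context_attribs[] = {
    EGL_CONTEXT_CLIENT_VERSION, 2,
    EGL_NONE
};
EGLContext context = eglCreateContext(display, config, EGL_NO_CONTEXT, context_attribs);
```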

(In Java code, the equivalent is GLSurfaceView.setEGLContextClientVersion(2).)

It is good practice, but not mandatory, to declare the GLES 2.0 requirement in your AndroidManifest.xml:
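The declaration is a uses-feature element (0x00020000 encodes major version 2, minor version 0):

```xml
<uses-feature android:glEsVersion="0x00020000" />
```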

When the user goes to the home screen (or receives a call), your application is paused. When the user comes back to your application, it is resumed, but the OpenGL context may be lost. In this case, you need to reload all the GPU-side resources (VBOs, textures, etc.). There is an Android event to detect when your application is resumed.

Similarly, when the user presses the Back button, the activity is destroyed, but the process still resides in memory and may be restarted.

For our wrapper, we considered that GLUT applications are generally not designed to resume the OpenGL context, let alone reset all statically-assigned variables. Consequently, the application just exits completely when the context is lost - just like when the application window is closed on desktops.

Android Events
Even if we write native code, our application is still started through a Java process, using the android.app.NativeActivity built-in activity. That process is responsible for receiving device events and forwarding them to our app.

Workflow:
 * The Android OS sends an event to the NativeActivity Java process
 * The Java Activity framework calls the appropriate Activity callback (e.g. onPause)
 * NativeActivity calls its matching JNI function in android_app_NativeActivity.cpp (e.g. onPauseNative)
 * which calls the matching native callback registered by the glue code (e.g. the onPause member of ANativeActivityCallbacks)
 * the glue callback writes a message through a C pipe (e.g. an APP_CMD_PAUSE command), and returns immediately so that the Java process doesn't get stuck (otherwise the user would be offered to kill it)
 * in our native app, on a regular basis, we check the event queue by calling ALooper's ALooper_pollAll (or ALooper_pollOnce)
 * going back one level up in android_native_app_glue, process_cmd executes a pre-event and a post-event generic hook, and in-between calls our app's onAppCmd callback
 * back down in our app, the onAppCmd hook (e.g. engine_handle_cmd) processes the event at last!
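The polling side of this workflow, as in the native-activity sample, might be sketched as follows (the rendering step is elided):

```cpp
// Main loop: process pending events, then render a frame
while (1) {
    int events;
    struct android_poll_source* source;

    // Poll without blocking (timeout 0); the glue code registered the
    // command pipe and the input queue as sources on this looper
    while (ALooper_pollAll(0, NULL, &events, (void**)&source) >= 0) {
        if (source != NULL)
            source->process(app, source);  // calls process_cmd / process_input

        if (app->destroyRequested)
            return;  // the activity is being destroyed: exit
    }

    // ... draw the next frame and swap buffers here ...
}
```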

Resources/Assets
Android applications typically extract resources (such as shaders or meshes) from their .apk file (which is really a Zip archive).
 * resources are located in res/ sub-folders (e.g. res/layout/); there are Android functions to load them depending on their type
 * assets are located in the assets/ folder and are accessed through a more traditional directory structure

That's not common for GLUT applications, so let's try to make resources available transparently:
 * using a wrapper around fopen/open:
   * loaded with LD_PRELOAD, such as zlibc
   * using the kernel ptrace hooks
   * redefining fopen in our .cpp file
 * extracting files beforehand

Using a fopen/open wrapper is tedious to implement, because our application is called through JNI. This means we cannot just exec another application after setting LD_PRELOAD. Instead, we'd need to start a child process, forward it all the Android events, and set up an IPC to share the required data structures. ptrace also requires a child process.

Redefining fopen locally would work for C, but not for C++.

Pre-extracting assets requires additional disk space to store the files, but is the most reasonable solution.

Accessing assets
Developers have been struggling to access resources easily in the NDK:
 * Android API : you can call the Android Java functions through JNI, but getting a file descriptor requires using unofficial functions and only works on uncompressed files; using Java buffer operations instead is quite tedious from C/C++
 * libzip : you can easily access the .apk with libzip, though you need to integrate the library in your build system
 * NDK API : in Android 2.3/Gingerbread/API android-9 at last, there is an NDK API to access resources

Let's use the NDK API. It is not transparent for the developer either (no fopen/cout replacement) but is reasonably easy to use.

What's a bit more tricky is to grab the AssetManager from Java/JNI in our native activity.

Note: we'll use the slightly simplified C++ syntax for JNI (not the C syntax).

First, our native activity runs in its own thread, so we need to take care when retrieving the JNI handle: the thread must first be attached to the VM.

Then let's get a handle on our calling NativeActivity instance:
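Both steps might look like this sketch, using the android_app structure from the glue code (note that, despite its name, the clazz field holds a reference to the NativeActivity instance):

```cpp
// Attach our native thread to the VM to get a valid JNIEnv
JavaVM* vm = app->activity->vm;
JNIEnv* env = NULL;
vm->AttachCurrentThread(&env, NULL);

// Handle on the calling NativeActivity instance
jobject activity = app->activity->clazz;
```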

We also need to decide where to extract the files. We'll use the application's standard cache directory:
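Retrieving the cache directory goes through JNI calls to Context.getCacheDir() and File.getAbsolutePath(); a sketch, with error checking elided:

```cpp
// cache_dir = activity.getCacheDir().getAbsolutePath()
jclass activity_class = env->GetObjectClass(activity);
jmethodID get_cache_dir =
    env->GetMethodID(activity_class, "getCacheDir", "()Ljava/io/File;");
jobject file = env->CallObjectMethod(activity, get_cache_dir);

jclass file_class = env->GetObjectClass(file);
jmethodID get_abs_path =
    env->GetMethodID(file_class, "getAbsolutePath", "()Ljava/lang/String;");
jstring jpath = (jstring)env->CallObjectMethod(file, get_abs_path);

const char* cache_dir = env->GetStringUTFChars(jpath, NULL);
// ... use cache_dir, then env->ReleaseStringUTFChars(jpath, cache_dir) ...
```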

We now can get the NativeActivity AssetManager:
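We call NativeActivity.getAssets() through JNI and convert the result with AAssetManager_fromJava (a sketch):

```cpp
#include <android/asset_manager_jni.h>

// asset_manager = activity.getAssets(), converted to a native handle
jmethodID get_assets =
    env->GetMethodID(env->GetObjectClass(activity),
                     "getAssets", "()Landroid/content/res/AssetManager;");
jobject jassets = env->CallObjectMethod(activity, get_assets);
AAssetManager* asset_manager = AAssetManager_fromJava(env, jassets);
```

Note: the ANativeActivity structure also exposes this handle directly as its assetManager field, which avoids the JNI round-trip.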

The actual extraction is simple: browse all files and copy them on disk one by one:
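The copy loop enumerates the asset directory with the NDK API and writes each file to disk; a sketch assuming a flat assets/ folder and the cache_dir destination retrieved above:

```cpp
#include <android/asset_manager.h>
#include <limits.h>
#include <stdio.h>

// Browse all files in assets/ and copy them one by one
AAssetDir* dir = AAssetManager_openDir(asset_manager, "");
const char* filename;
while ((filename = AAssetDir_getNextFileName(dir)) != NULL) {
    AAsset* asset = AAssetManager_open(asset_manager, filename, AASSET_MODE_STREAMING);

    char dest[PATH_MAX];
    snprintf(dest, sizeof(dest), "%s/%s", cache_dir, filename);
    FILE* out = fopen(dest, "wb");

    char buf[BUFSIZ];
    int nb_read;
    while ((nb_read = AAsset_read(asset, buf, sizeof(buf))) > 0)
        fwrite(buf, 1, nb_read, out);

    fclose(out);
    AAsset_close(asset);
}
AAssetDir_close(dir);
```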

Now, all files can be accessed using plain fopen/cout by the application.

This technique is adapted to our tutorials, but probably not for bigger applications. In this case, you could either:
 * request write privilege on the SD card and extract files there (that's what the SDL Android port does),
 * use a wrapper around your file accesses that uses the AssetManager on Android (beware that it's read-only access)

Orientation
By setting android:screenOrientation="portrait" on your activity, your application only works in portrait mode, independently of the device orientation or shape. This is not recommended, but may be useful for some games.
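For instance, as a manifest fragment (the rest of the activity declaration is elided):

```xml
<activity android:name="android.app.NativeActivity"
          android:screenOrientation="portrait">
```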

To handle orientation changes properly, you theoretically need to check for APP_CMD_WINDOW_RESIZED events. The android_native_app_glue wrapper doesn't seem to generate that event appropriately on orientation change, so instead we'll just monitor the window size regularly and process the change ourselves.

Note: it is possible to process APP_CMD_CONFIG_CHANGED events, but they happen before the screen is resized, so it's too early to get the new screen size.

Android can only detect the new screen size after a buffer swap, so let's abuse the buffer-swap step to get a resize event.
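One way to monitor the size, as a sketch (the engine structure holding the last known dimensions and the GLUT-style reshape callback are placeholders):

```cpp
// After eglSwapBuffers: did the surface size change (e.g. on rotation)?
EGLint new_width, new_height;
eglQuerySurface(engine->display, engine->surface, EGL_WIDTH, &new_width);
eglQuerySurface(engine->display, engine->surface, EGL_HEIGHT, &new_height);

if (new_width != engine->width || new_height != engine->height) {
    engine->width = new_width;
    engine->height = new_height;
    glViewport(0, 0, new_width, new_height);
    reshape(new_width, new_height);  // forward to the GLUT-style callback
}
```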

Input events
We reuse the input handling code (the engine_handle_input callback) from the native-activity sample.

It's important to return 0 when the event is not directly handled, so that the Android system processes it. For instance, we usually let Android take care of the Back button.
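As a sketch of such a callback (the AKEYCODE_BACK case shows the pass-through; the key handling itself is elided):

```cpp
static int32_t engine_handle_input(struct android_app* app, AInputEvent* event) {
    if (AInputEvent_getType(event) == AINPUT_EVENT_TYPE_KEY) {
        int32_t keycode = AKeyEvent_getKeyCode(event);
        if (keycode == AKEYCODE_BACK)
            return 0;  // not handled: let Android close the activity

        // ... handle other keys ...
        return 1;  // handled
    }
    return 0;  // not handled: let the system process it
}
```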

The NativeActivity framework doesn't seem to send appropriate repeat events: the key is pressed and unpressed at the exact same time, and the repeat count is always 0. Consequently it doesn't seem possible to process arrow keys from Hacker's Keyboard without rewriting part of the framework.

Motion (touchscreen) and keyboard events are handled through the same channel.

To allow users without a keyboard to use the arrow keys, we implemented a virtual keypad (VPAD), located in the bottom-left corner and activated through the touchscreen. Effort was made to avoid mixing a VPAD event with an existing motion event and vice-versa.

References

 * the documentation in your NDK installation directory: details on the building process
 * http://developer.android.com/guide/topics/manifest/manifest-element.html : AndroidManifest.xml reference
 * http://developer.android.com/guide/developing/device.html : official documentation on connecting your device with USB
 * http://developer.android.com/reference/android/app/NativeActivity.html : official documentation on Java-less apps
 * http://developer.android.com/sdk/ndk/ : Android NDK, Revision 5 (December 2010) introduces native activities
 * http://blog.tewdew.com/post/6852907694/using-jni-from-a-native-activity : Using JNI from a Native Activity

Source code for the NativeActivity built-in:
 * android_app_NativeActivity.cpp
 * NativeActivity.java