Sunday 16 March 2008

Water reflections with OpenGL

I've always wanted to implement water waves with nice reflections, even more so when Half-Life 2 came out, followed by BioShock, Crysis... The water in those games looked so cool that I couldn't help but feel I was falling behind with my OpenGL skills. Unfortunately, I never really had time to focus on that, since I was so heavily into 2D vector graphics stuff.



Luckily, this year at uni we got this awesome assignment for the Interactive 3D subject, where we have to generate a water surface with some random waves going on and a boat that wobbles believably as the waves pass it. The assignment doesn't strictly require reflections on the water surface, but we do get bonus marks for making it look cooler, so I thought, well, why not do it the proper way!

In the end, implementing all this was much easier than I thought it would be. For a start, you need to render a reflection of the whole scene (less the water surface). Since the water usually lies in the horizontal (XZ) plane, this is no harder than multiplying the modelview matrix by a negative vertical scale (a value of -1 for the Y coordinate). If the water surface is not at height 0, or the reflective surface (e.g. a mirror) lies in a different plane, just do whatever you need to get the scene mirrored over that plane. Also, remember to respecify the light positions after that so they get mirrored properly, and to change the culled face from GL_BACK to GL_FRONT (if you are doing backface culling at all), since mirroring reverses the orientation of the polygon vertices.
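In the simple case where the water lies at Y = 0, the mirror pass boils down to something like this (a fixed-function sketch - drawScene and lightPos stand in for your own scene and lighting code):

GLfloat lightPos[] = { 10.0f, 20.0f, 10.0f, 1.0f };  // example light position

glPushMatrix ();
glScalef (1.0f, -1.0f, 1.0f);                  // mirror the scene over the XZ plane
glLightfv (GL_LIGHT0, GL_POSITION, lightPos);  // respecify the light so it gets mirrored too
glCullFace (GL_FRONT);                         // mirroring flips the polygon winding
drawScene ();                                  // everything except the water surface
glCullFace (GL_BACK);
glPopMatrix ();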


If there were no waves on the water surface, that would be it. Just draw the water plane over it with a half-transparent bluish color and you are done. To be able to displace the reflection image with waves, though, you need to render the mirrored scene to a texture. You can either do it with a framebuffer object or (what I find handier) just do a glCopyTexSubImage2D. If you don't want to waste memory and are willing to sacrifice some reflection image resolution (with lots of waves no one will notice anyway), you can create a texture smaller than the window size, but don't forget to respecify the glViewport to fit the size of the texture.
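With the glCopyTexSubImage2D route, the copy itself is just a few calls. A rough sketch - reflTexture, reflTexSize, windowWidth and windowHeight are names I'm assuming you have set up elsewhere (the texture itself created beforehand with glTexImage2D):

glViewport (0, 0, reflTexSize, reflTexSize);   // render the mirrored scene at the texture's resolution
// ... render the mirrored scene here ...
glBindTexture (GL_TEXTURE_2D, reflTexture);
glCopyTexSubImage2D (GL_TEXTURE_2D, 0, 0, 0, 0, 0, reflTexSize, reflTexSize);
glViewport (0, 0, windowWidth, windowHeight);  // restore the viewport for the normal pass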


Now that you've got the reflected image in a texture, it is time to write some shaders. Basically, what you want to achieve is to deform the way the texture is drawn to the screen. To get it really nice, this has to be done per-pixel in a fragment shader. A direct copy of the texture would mean that each fragment takes the pixel from the texture exactly at its window coordinate (gl_FragCoord.xy). To get the wobbly effect, this read has to be offset by a certain amount depending on the normal of the surface at the current pixel. These normals might come from a normal map or (as in my case) be computed in the shader itself (from the first derivative of the wave function).
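For reference, the undistorted version would be a fragment shader that does nothing but copy the texture at the fragment's own window coordinate (screenSize is an assumed uniform holding the viewport size in pixels):

uniform sampler2D reflectionTex;
uniform vec2 screenSize;   // viewport size in pixels

void main ()
{
    vec2 screenCoord = gl_FragCoord.xy / screenSize;  // window coordinate mapped to [0, 1]
    gl_FragColor = texture2D (reflectionTex, screenCoord);
}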

The really tricky part here is finding the right way to transform surface normals into texture sampling offsets. Keep in mind that (as opposed to real ray tracing) this technique is purely an illusion and not a proper physical simulation, so anything that makes it at least look right will do. Here I will give you my solution, which is based more on empirical experimentation than on any physical-mathematical approach.

The first invariant is quite obvious - the pixels whose normal equals the undeformed mirror surface normal must not deform the image in any way (a totally flat mirror should just copy the texture to the screen). So what I do is take into consideration just the part of the normal vector that offsets it away from the mirror normal (the projection of the normal vector onto the surface plane). The easiest way to get it is to subtract out the projection of the normal onto the mirror plane normal:

vec3 flatNormal = waveNormal - dot (waveNormal, mirrorNormal) * mirrorNormal;


The next thing I do is transform the flattened normal into eye space. The XY coordinates of the result tell us how we see that normal on the screen (perspective projection not taken into account).

vec3 eyeNormal = gl_NormalMatrix * flatNormal;

We've almost got it now. We could use the XY coordinates to offset the reads into the reflection texture, but there is one problem with that. When the camera gets really close to the water surface, the projected image of the flat normals near the center of the view gets really small, and thus the image is more distorted at the sides than in the middle. To correct this, I normalize the projected normal so it serves only as a direction, and use the length of the unprojected flattened normal as the magnitude (note that this value is zero when the normal equals the mirror normal and goes toward 1 when it is perpendicular to it - or rather, it is the cosine of the angle between the normal and the mirror plane).

vec2 reflectOffset = normalize (eyeNormal.xy) * length (flatNormal) * 0.1;

The 0.1 factor is there just to scale down the whole distortion. You can adjust it to your liking - just keep in mind that the greater it is, the more the reflection image will be distorted.
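
Just to tie things together, a complete fragment shader along these lines might look roughly like this. It's only a sketch under my assumptions: waveNormal comes in from the vertex shader (or could be sampled from a normal map), screenSize and the water tint are example names and values, and the water is assumed to lie in the XZ plane:

uniform sampler2D reflectionTex;
uniform vec2 screenSize;    // viewport size in pixels
varying vec3 waveNormal;    // wave surface normal from the vertex shader

void main ()
{
    const vec3 mirrorNormal = vec3 (0.0, 1.0, 0.0);  // undeformed water surface normal

    vec3 n = normalize (waveNormal);

    // the part of the normal that lies in the mirror plane
    vec3 flatNormal = n - dot (n, mirrorNormal) * mirrorNormal;

    // how that part is oriented as seen from the camera
    vec3 eyeNormal = gl_NormalMatrix * flatNormal;

    // direction from the eye-space vector, magnitude from the unprojected length
    // (for a perfectly flat surface flatNormal is zero and the normalize is undefined,
    // but with waves present that never happens)
    vec2 reflectOffset = normalize (eyeNormal.xy) * length (flatNormal) * 0.1;

    // sample the reflection texture at the offset window coordinate
    vec2 screenCoord = gl_FragCoord.xy / screenSize + reflectOffset;
    vec4 reflection = texture2D (reflectionTex, screenCoord);

    // blend in a half-transparent bluish water color
    gl_FragColor = mix (reflection, vec4 (0.1, 0.3, 0.5, 1.0), 0.4);
}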