Lucy in the Deep
Photon Mapping, Subsurface Scattering (translucency), Participating Media/Volumetric Scattering (fog-like effects), Phillips Spectrum FFT Waves (water), Perlin and Worley-based Noise
Craig Donner
cdonner@graphics.ucsd.edu
For this project I wrote a ray tracer from scratch, which I humbly named "Hikaru", Japanese for "ray" or "light." No LRT or Miro code here... it's all mine. Hikaru managed to grow from a simple ray tracer to a full global illumination renderer, all in the course of 10 weeks. It also makes a kick-ass bounding-volume hierarchy ^_^, and a BSP tree as well. There were a few optimization tweaks (fewer shadow rays the further you are from the camera, shorter step sizes for rays that trace deeper into the scene, etc.). The final image was rendered using an AABB hierarchy, with over 500,000 triangles. It took about 5 hours to render on a P4 2.4 GHz using around 250 MB of RAM.
This project somehow managed to involve the following:
This scene actually doesn't require that much global illumination; I used photon maps more for translucency (see below) than for anything else. Henrik's book is probably the best reference.
I'll add this to the photon mapping section, since it seems fitting and doesn't really deserve its own section. I implemented Greg Ward's irradiance caching and irradiance gradients algorithms; they can speed up gathering by as much as a factor of 100 (or more!).
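The heart of irradiance caching is Ward's error weight, which decides when a cached sample can be reused instead of doing a full hemisphere gather. Here's a minimal sketch of the idea in Python; the sample layout and the `inv_error` threshold are my own illustration, not Hikaru's actual code:

```python
import math

def ward_weight(x, n, sample):
    """Ward's error weight for one cached irradiance sample.
    sample = (position, normal, harmonic-mean distance Ri, irradiance Ei)."""
    xi, ni, Ri, Ei = sample
    d = math.dist(x, xi)
    ndot = max(-1.0, min(1.0, sum(a * b for a, b in zip(n, ni))))
    # the weight blows up as the query approaches the sample; cap the denominator
    return 1.0 / max(d / Ri + math.sqrt(1.0 - ndot), 1e-6)

def interpolate_irradiance(x, n, cache, inv_error=10.0):
    """Weighted average of acceptable cached samples; None means no sample
    passed the error test and a fresh gather is needed at x."""
    num = den = 0.0
    for s in cache:
        w = ward_weight(x, n, s)
        if w > inv_error:
            num += w * s[3]
            den += w
    return num / den if den > 0.0 else None
```

In the full algorithm the translational and rotational gradients refine each cached irradiance before averaging, which is what smooths out the blotches you'd otherwise get.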
Translucent materials include marble, jade, skin, plants, and many more. Henrik's papers on subsurface scattering gave me the math to simulate them. A Practical Model for Subsurface Light Transport had quite a bit of math, but it was good preparation for A Rapid Hierarchical Rendering Technique for Translucent Materials, which I actually implemented, using the octree caching method proposed in the paper.
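The piece of math doing the heavy lifting is the dipole diffusion approximation from the first paper (quoted here from memory, so check the original): the diffuse reflectance due to incident flux at distance r is

```latex
R_d(r) = \frac{\alpha'}{4\pi}\left[
    z_r \,(1+\sigma_{tr} d_r)\,\frac{e^{-\sigma_{tr} d_r}}{d_r^{3}}
  + z_v \,(1+\sigma_{tr} d_v)\,\frac{e^{-\sigma_{tr} d_v}}{d_v^{3}}
\right]
```

with $d_r=\sqrt{r^2+z_r^2}$ and $d_v=\sqrt{r^2+z_v^2}$, a real source at depth $z_r = 1/\sigma_t'$, a mirrored virtual source at $z_v = z_r(1+4A/3)$, and effective transport coefficient $\sigma_{tr}=\sqrt{3\sigma_a\sigma_t'}$.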
Volumetric scattering gives fog- or smoke-like effects, and, in this case, god-rays in water. For whatever reason Henrik doesn't have the paper (Efficient Simulation of Light Transport in Scenes with Participating Media using Photon Maps) on his site, but you can get it off of ACM. I actually found that most of the effects I wanted for this scene could be created using direct ray-marching, but I was able to make some nice volume caustics (see test images) using the technique.
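The direct ray-marching part is simple enough to sketch. This is a toy single-scattering marcher for homogeneous fog with an isotropic phase function; the constant `light_radiance` stands in for a real shadow-ray or volume-photon-map lookup at each step, and all the names are my own, not Hikaru's:

```python
import math

def ray_march_fog(ray_len, sigma_a, sigma_s, light_radiance, steps=64):
    """March a ray of length ray_len through homogeneous fog, accumulating
    single-scattered light. Returns (in-scattered radiance, transmittance)."""
    sigma_t = sigma_a + sigma_s          # extinction = absorption + out-scattering
    dt = ray_len / steps
    transmittance, radiance = 1.0, 0.0
    for _ in range(steps):
        # attenuate the eye ray over this segment
        transmittance *= math.exp(-sigma_t * dt)
        # accumulate in-scattered light, isotropic phase function = 1/(4*pi)
        radiance += transmittance * sigma_s * (1.0 / (4.0 * math.pi)) * light_radiance * dt
    return radiance, transmittance
```

For god-rays, the per-step light lookup is what matters: steps that are shadowed by geometry (or by the wave surface) contribute nothing, and the visible shafts are the unshadowed steps.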
A couple of years ago Jerry Tessendorf gave a talk at SIGGRAPH about simulating ocean waves using the FFT. His slides are available from SIGGRAPH, but in my opinion Deep-Water Animation and Rendering does a decent job explaining it, as well as a few other water algorithms.
I wrote a small demo that implemented the above algorithm, and incorporated it into my renderer. The results were quite nice, and it really pays to have good looking water when you try to make underwater volume caustics (otherwise known as "god-rays").
I made use of two noise functions for different aspects of the scene. Perlin noise made a pretty good marble texture, and Worley "noise" (F2 - F1) made a good bump map and ground texture. I recommend Texturing and Modeling: A Procedural Approach as a good reference. I actually bought it.
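For anyone unfamiliar with it, F2 - F1 is just the distance to the second-closest feature point minus the distance to the closest. A brute-force 2D sketch (a real implementation hashes feature points into grid cells rather than scanning them all):

```python
import math

def worley_f2_minus_f1(x, y, points):
    """F2 - F1 Worley 'noise': distance to the 2nd-closest feature point
    minus distance to the closest. Needs at least two feature points."""
    d = sorted(math.hypot(x - px, y - py) for px, py in points)
    return d[1] - d[0]
```

The function falls to zero along the borders between cells, which is what gives the cracked, cell-like pattern that works so well for ground textures and bump maps.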
While writing Hikaru, I wanted creating a scene (placing lights, geometry, etc.) to be a simpler process than changing code and recompiling. So I used flex and bison (like lex and yacc) to make a scripting language. Here's a sample:
My initial renderings were of simple scenes, such as the above Cornell Box using irradiance caching + gradients. It only took several seconds to render the image. Irradiance caching works by assuming that the irradiance over a surface changes gradually, which is generally true for diffuse Lambertian surfaces. The model tends to break down with caustics or hard edges (requiring more samples), but for many scenes it works well. The algorithm uses an octree to store cache values, each in an oct about the size of the useful radius of the sample, which makes finding candidate samples fairly straightforward. The picture on the right is a visualization of the irradiance cache. The red lines are translational gradients, and the green are rotational.
The .ply model had 14,027,872 vertices and 28,055,742 triangles in it, which was a bit much. In fact, I couldn't even load it on my machine. I tried a variety of programs to simplify the geometry, and finally settled on RSimp, a great little tool by Benjamin Watson and Dmitry Brodsky. Of course, simplifying the mesh required some compute power and memory, so I used my account at the San Diego Supercomputer Center on an Origin 2000. RSimp ran for 16 hours, using 4.5 GB of RAM, but it eventually simplified the mesh down to something manageable (~2 million triangles).
Subsurface scattering was my next hurdle. I implemented the hierarchical method proposed in Henrik's second paper, which stores irradiance values distributed evenly over the geometry of an object in an octree. Unlike the octree used for irradiance caching, the one for translucency actually stores representative values at the intermediary octs, meaning that if the error is low enough for a particular oct, its irradiance can be used instead of searching its children.
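The traversal boils down to a solid-angle-style error test per oct. A sketch of the idea (the node tuple layout is hypothetical, and a real implementation would feed each returned sample through the dipole reflectance):

```python
def gather_samples(node, x, max_solid_angle=0.1):
    """Hierarchical gather: if a node's aggregate subtends a small enough
    approximate solid angle (area / distance^2) from query point x, use it;
    otherwise descend into its children.
    node = (centroid, total_area, avg_irradiance, children)."""
    centroid, area, irradiance, children = node
    d2 = sum((a - b) ** 2 for a, b in zip(x, centroid))
    if not children or (d2 > 0.0 and area / d2 < max_solid_angle):
        return [(centroid, area, irradiance)]   # aggregate is good enough
    out = []
    for child in children:
        out.extend(gather_samples(child, x, max_solid_angle))
    return out
```

Far from the query point whole subtrees collapse into a single sample, which is what makes the gather fast even with hundreds of thousands of surface samples.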
That's a shout out to Rui Wang for his awesome rendering of a translucent dragon. In fact, his images were no small inspiration for my project. This one, of course, used direct visualization of the photon map, and wasn't really tweaked for perfection, but it was a good start. More translucent images to come.
This one took most of my time. After reading the paper a few times I started to get the gist of the algorithm, and started to make some test images.
As you can see, it wasn't all smooth going. Eventually I figured out how to get stained glass to work, and how to sample the volume photon map correctly. Getting the phase function terms in the right place, as well as going through the equations for scattered luminance, took some time. Playing with scattering and absorption coefficients also became an ordeal. Eventually I started to combine techniques. The marble texture is procedural low-frequency fractal Perlin noise.
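The marble recipe itself is the classic one: sine stripes perturbed by a few octaves of fractal noise. A 1D sketch, using a cheap hash-based value noise as a stand-in for true Perlin gradient noise (all names and constants here are my own illustration):

```python
import math

def value_noise_1d(x, seed=0):
    """Hash-based value noise with smoothstep interpolation,
    a quick stand-in for Perlin gradient noise."""
    def hash01(i):
        h = (i * 374761393 + seed * 668265263) & 0xFFFFFFFF
        h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
        return (h & 0xFFFF) / 0xFFFF
    i, f = int(math.floor(x)), x - math.floor(x)
    t = f * f * (3 - 2 * f)                     # smoothstep
    return hash01(i) * (1 - t) + hash01(i + 1) * t

def turbulence(x, octaves=4):
    """Fractal sum: each octave doubles frequency, halves amplitude."""
    return sum(value_noise_1d(x * 2 ** o) / 2 ** o for o in range(octaves))

def marble(x, stripe_freq=2.0, strength=3.0):
    """Classic marble: sine stripes with turbulence-perturbed phase,
    remapped to [0, 1] for use as a texture value."""
    return 0.5 + 0.5 * math.sin(stripe_freq * x + strength * turbulence(x))
```

Keeping the turbulence low-frequency (few octaves, low `strength`) gives the broad, lazy veins of the marble in the scene.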
To really make good looking waves, I realized I had to do something more complicated. After reading several papers on waves, I settled on Tessendorf's Phillips-spectrum FFT waves. I ended up writing a little demo program to simulate waves.
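The spectrum itself is just a per-frequency-bin evaluation. A sketch of the Phillips spectrum (parameter names are mine; in the full method each FFT bin k gets a Gaussian random amplitude scaled by sqrt(P(k)/2), and heights come back through an inverse FFT):

```python
import math

def phillips(kx, ky, wind=(1.0, 0.0), wind_speed=31.0, amplitude=1.0, g=9.81):
    """Phillips spectrum P(k) = A * exp(-1/(kL)^2) / k^4 * |k_hat . w_hat|^2,
    where L = V^2/g is the largest wave arising from wind of speed V."""
    k2 = kx * kx + ky * ky
    if k2 == 0.0:
        return 0.0                      # DC term carries no wave energy
    L = wind_speed * wind_speed / g
    wlen = math.hypot(*wind)
    kdotw = (kx * wind[0] + ky * wind[1]) / (math.sqrt(k2) * wlen)
    return amplitude * math.exp(-1.0 / (k2 * L * L)) / (k2 * k2) * kdotw * kdotw
```

The |k_hat . w_hat|^2 factor is what suppresses waves traveling perpendicular to the wind, so the sea actually looks like it's being blown in one direction.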
Rendering good caustics proved to be a formidable task. Because my scene was becoming so large, it required a large number of photons (> 1,000,000) to produce even somewhat decent caustics. I implemented a modified ray-marching scheme to help directly visualize caustics, which produced better results without stochastic photon sampling. It works similarly to shadow-ray testing, except that it can refract through materials (like water) to produce correct caustics without hoping random photons will hit the right places. Importance sampling the waves wouldn't have helped much, since the entire area under the waves was rather important.
The seafloor texture is another Worley function, F2-F1, mapped to both color and bump, with a gradient noise applied to the sample point to break up the jaggedness. The floor mesh was generated using the above heightmap (made using "Render Clouds" in Photoshop, then a few Gaussian blurs to smooth it all out). The wave mesh (not visible) was generated using Tessendorf's FFT wave functions. Given a set of initial conditions, a wave height, a time, and a "choppiness" factor, they allow for rather realistic looking ocean waves. Since no particular time depends on any other, the method can easily be used for animations (I didn't have time to make one though!).
I started adding things to the scene.
Here's a pair of images showing the difference translucency makes in the image. You can see the light bleeding through Lucy from the rest of the scene and the god-rays in the background.
And once again, the final image.