lucy in the deep

Photon Mapping, Subsurface Scattering (translucency), Participating Media/Volumetric Scattering (fog-like effects), Phillips Spectrum FFT Waves (water), Perlin and Worley-based Noise

Craig Donner
cdonner@graphics.ucsd.edu

Implementation

For this project I wrote a ray tracer from scratch, humbly named "Hikaru", which means "ray" or "light." No LRT or Miro code here.... It's all mine. Hikaru managed to grow from a simple ray tracer to a full global illumination renderer, all in the course of 10 weeks. It also builds a kick-ass bounding-volume hierarchy ^_^, and a BSP tree as well. There were a few optimization tweaks (fewer shadow rays the farther a point is from the camera, shorter step sizes for rays that trace deeper into the scene, etc.). The final image was rendered using an AABB hierarchy, with over 500,000 triangles. It took about 5 hours to render on a P4 2.4 GHz using around 250 MB of RAM.
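
Here's a minimal sketch of what I mean by the shadow-ray tweak (not Hikaru's actual code; the names and the falloff shape are hypothetical):

    // Hypothetical distance-based shadow-ray budget: spend fewer shadow rays
    // on points far from the camera, where the extra noise is less visible.
    #include <algorithm>
    #include <cmath>

    int shadowRayCount(double distToCamera, int minRays, int maxRays,
                       double falloff /* scene-scale constant */)
    {
        double t = std::exp(-distToCamera / falloff);   // 1 near, -> 0 far
        int n = (int)(minRays + t * (maxRays - minRays));
        return std::max(minRays, std::min(maxRays, n)); // always cast a few
    }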

This project somehow managed to involve the following:

- Photon mapping
- Subsurface scattering (translucency)
- Participating media / volumetric scattering (fog-like effects)
- Phillips-spectrum FFT waves (water)
- Perlin- and Worley-based noise

The Road to Final Rendering

Irradiance Caching/Gradients

My initial renderings were of simple scenes, such as the above Cornell box, using irradiance caching + gradients. The image took only a few seconds to render. Irradiance caching works by assuming that the irradiance over a surface changes gradually, which is generally true for diffuse Lambertian surfaces. The model tends to break down with caustics or hard edges (requiring more samples), but for many scenes it works well. The algorithm uses an octree to store cached values, each in a node about the size of the useful radius of the sample, which makes finding candidate samples fairly straightforward. The picture on the right is a visualization of the irradiance cache. The red lines are translational gradients, and the green are rotational.
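
For the curious, the reuse test boils down to the classic Ward-style record weight; a small sketch (field names are mine, not necessarily Hikaru's):

    // Weight of a cached irradiance record at shading point x with normal n:
    // w = 1 / (|x - xi| / Ri + sqrt(1 - n . ni)). Records whose weight exceeds
    // a threshold contribute to the interpolated irradiance.
    #include <algorithm>
    #include <cmath>

    struct Vec3 { double x, y, z; };
    double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
    double dist(const Vec3& a, const Vec3& b) {
        Vec3 d{a.x - b.x, a.y - b.y, a.z - b.z};
        return std::sqrt(dot(d, d));
    }

    struct CacheRecord {
        Vec3   pos, normal;
        double R;   // harmonic mean distance to surfaces visible from pos
    };

    double recordWeight(const CacheRecord& rec, const Vec3& x, const Vec3& n)
    {
        double denom = dist(x, rec.pos) / rec.R
                     + std::sqrt(std::max(0.0, 1.0 - dot(n, rec.normal)));
        return denom > 0.0 ? 1.0 / denom : 1e9;   // exact hit: effectively infinite
    }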


Lucy

The model of Lucy I used is from The Stanford 3D Scanning Repository. The original .ply model had 14,027,872 vertices and 28,055,742 triangles in it, which was a bit much. In fact, I couldn't even load it on my machine. I tried a variety of programs to simplify the geometry, and finally settled on RSimp, a great little tool by Benjamin Watson and Dmitry Brodsky. Of course, simplifying the mesh required some compute power and memory, so I used my account at the San Diego Supercomputer Center on an Origin 2000. RSimp ran for 16 hours, using 4.5 GB of RAM, but it eventually simplified the mesh down to something manageable (~2 million triangles).

Translucency

Subsurface scattering was my next hurdle. I implemented the hierarchical method proposed in Henrik's second paper, which stores irradiance values distributed evenly over the geometry of an object in an octree. Unlike the octree used for irradiance caching, the one for translucency actually stores representative values at the interior nodes, meaning that if the error is low enough for a particular node, its irradiance can be used instead of searching its children.
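
A rough sketch of that traversal, after Jensen and Buhler's hierarchical evaluation (the node layout and error criterion here are illustrative, and the diffusion profile is stubbed out):

    // Hierarchical evaluation sketch: if an octree node subtends a small
    // enough solid angle as seen from x, use its aggregate irradiance;
    // otherwise descend into its children. Not Hikaru's actual code.
    #include <vector>

    struct Vec3 { double x, y, z; };

    double dist2(const Vec3& a, const Vec3& b) {
        double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return dx*dx + dy*dy + dz*dz;
    }

    struct OctNode {
        Vec3   centroid;                  // area-weighted position of samples below
        double irradiance;                // aggregate irradiance of samples below
        double area;                      // total sample area below
        std::vector<OctNode*> children;   // empty at the leaves
    };

    // Placeholder diffusion profile (takes squared distance); a real
    // implementation would use the dipole approximation Rd(r).
    double dipoleRd(double r2) { return 1.0 / (1.0 + r2); }

    // Radiant exitance at x: Mo(x) = sum_i Rd(|x - xi|) * Ei * Ai.
    double evalMo(const OctNode* node, const Vec3& x, double maxSolidAngle)
    {
        double d2 = dist2(node->centroid, x);
        // area / distance^2 approximates the solid angle the node subtends.
        if (node->children.empty() || node->area / d2 < maxSolidAngle)
            return node->irradiance * node->area * dipoleRd(d2);
        double Mo = 0.0;
        for (const OctNode* c : node->children)
            Mo += evalMo(c, x, maxSolidAngle);
        return Mo;
    }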

That's a shout-out to Rui Wang for his awesome rendering of a translucent dragon. In fact, his images were no small inspiration for my project. This one, of course, used direct visualization of the photon map and wasn't really tweaked for perfection, but it was a good start. More translucent images to come.

Volumetric Scattering

This one took most of my time. After reading the paper a few times I started to get the gist of the algorithm and began making some test images.

As you can see, it wasn't all smooth going. Eventually I figured out how to get stained glass to work, and how to sample the volume photon map correctly. Getting the phase function terms in the right place, as well as working through the equations for scattered radiance, took some time. Playing with the scattering and absorption coefficients also became an ordeal. Eventually I started to combine techniques. The marble texture is procedural low-frequency fractal Perlin noise.
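
To give the flavor of it, here's a bare-bones single-scattering ray marcher for a homogeneous medium; the names are mine, and the volume-photon-map lookup is stubbed out, since that part is the interesting (and painful) bit:

    // Accumulate in-scattered radiance along the ray, attenuated by the
    // extinction accumulated so far. radianceEstimate() stands in for the
    // volume-photon-map lookup (nearest photons weighted by the phase function).
    #include <cmath>

    struct Vec3 { double x, y, z; };
    Vec3 operator+(const Vec3& a, const Vec3& b) { return {a.x+b.x, a.y+b.y, a.z+b.z}; }
    Vec3 operator*(const Vec3& a, double s)      { return {a.x*s, a.y*s, a.z*s}; }

    // Stub: constant in-scatter. The real version queries the volume photon
    // map at p and applies the phase function for the direction toward the eye.
    double radianceEstimate(const Vec3& /*p*/, const Vec3& /*toEye*/) { return 0.1; }

    double marchMedium(const Vec3& origin, const Vec3& dir, double tMax,
                       double step, double sigmaS, double sigmaA)
    {
        double sigmaT = sigmaS + sigmaA;   // extinction coefficient
        double L = 0.0, tau = 0.0;         // accumulated radiance, optical depth
        for (double t = 0.5 * step; t < tMax; t += step) {
            Vec3 p = origin + dir * t;
            double Tr = std::exp(-tau);    // transmittance back to the ray origin
            L   += Tr * sigmaS * radianceEstimate(p, dir * -1.0) * step;
            tau += sigmaT * step;          // homogeneous medium assumed
        }
        return L;
    }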

Underwater

At this point, while trying to think of a scene, I decided that an underwater scene would make the best use of most of the algorithms I had implemented. But to make an underwater scene, I first needed water. Turbulence functions produced rather repetitive patterns, and were difficult to control.

To make really good-looking waves, I realized I had to do something more complicated. After reading several papers on waves, I settled on Tessendorf's Phillips-spectrum FFT waves. I ended up writing a little demo program to simulate the waves.
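
The heart of it is the Phillips spectrum itself, which assigns each wave mode an energy based on its wavevector and the wind. A sketch (parameter names follow Tessendorf's paper; the FFT plumbing around it is omitted):

    // Phillips spectrum P(k) = A * exp(-1/(kL)^2) / k^4 * |k_hat . w_hat|^2,
    // with L = V^2 / g the largest wave arising from wind speed V.
    #include <cmath>

    double phillips(double kx, double ky,          // wavevector components
                    double wx, double wy,          // wind direction (nonzero)
                    double V, double A,            // wind speed, amplitude constant
                    double g = 9.81)
    {
        double k2 = kx*kx + ky*ky;
        if (k2 < 1e-12) return 0.0;                // no energy in the DC term
        double L     = V * V / g;
        double wLen  = std::sqrt(wx*wx + wy*wy);
        double kDotW = (kx*wx + ky*wy) / (std::sqrt(k2) * wLen);
        return A * std::exp(-1.0 / (k2 * L * L)) / (k2 * k2)
                 * (kDotW * kDotW);                // damp waves across the wind
    }

Each initial Fourier amplitude is then a complex Gaussian sample scaled by sqrt(P(k)/2), and the field is animated with the deep-water dispersion relation omega(k) = sqrt(g*k) before an inverse FFT turns it back into heights.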

Rendering good caustics proved to be a formidable task. Because my scene was becoming so large, it required a large number of photons (> 1,000,000) to produce even somewhat decent caustics. I implemented a modified ray-marching scheme to help directly visualize caustics, which produced better results without stochastic photon sampling. It works similarly to shadow-ray testing, except that the rays can refract through materials (like water), producing correct caustics without hoping random photons will hit the right places. Importance sampling the waves wouldn't have helped much, since the entire area under the waves was rather important.
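
The geometric core of that probe is just Snell's law applied where the ray crosses the water surface, something like this (a sketch, not the actual scheme; the scene and intersection plumbing are elided):

    // Snell refraction for the probe ray: d is the incident direction,
    // n the surface normal on the incident side, eta = n1/n2.
    #include <cmath>

    struct Vec3 { double x, y, z; };
    Vec3 operator+(const Vec3& a, const Vec3& b) { return {a.x+b.x, a.y+b.y, a.z+b.z}; }
    Vec3 operator*(const Vec3& a, double s)      { return {a.x*s, a.y*s, a.z*s}; }
    double dot(const Vec3& a, const Vec3& b)     { return a.x*b.x + a.y*b.y + a.z*b.z; }

    bool refract(const Vec3& d, const Vec3& n, double eta, Vec3& out)
    {
        double cosI  = -dot(d, n);
        double sin2T = eta * eta * (1.0 - cosI * cosI);
        if (sin2T > 1.0) return false;             // total internal reflection
        double cosT = std::sqrt(1.0 - sin2T);
        out = d * eta + n * (eta * cosI - cosT);   // bent continuation direction
        return true;
    }

Instead of declaring the point shadowed at the surface hit, the probe keeps marching toward the light along the refracted direction, so light lands where the bent rays actually go.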


The seafloor texture is another Worley function mapping F2 - F1, with gradient noise applied to the sample point to break up the jaggedness, used for both color and bump mapping. The floor mesh was generated using the above heightmap (made using "Render Clouds" in Photoshop, then a few Gaussian blurs to smooth it all out). The wave mesh (not visible) was generated using Tessendorf's FFT wave functions. Given a set of initial conditions, a wave height, a time, and a "choppiness" factor, they produce rather realistic-looking ocean waves. Since the height field at any time can be evaluated independently of any other, the method can easily be used for animations (I didn't have time to make one, though!).
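
Since F2 - F1 is doing most of the work there, here's what that evaluation boils down to (a compact sketch, not the actual texture code; the gradient-noise jitter of the sample point happens before this call):

    // 2D Worley (cellular) F2 - F1: distance to the second-nearest feature
    // point minus distance to the nearest. One jittered feature point per
    // cell, arbitrary integer hash.
    #include <cmath>

    static double hash01(int x, int y, int seed)
    {
        unsigned h = (unsigned)x * 374761393u + (unsigned)y * 668265263u
                   + (unsigned)seed * 144269u;
        h = (h ^ (h >> 13)) * 1274126177u;
        return (h ^ (h >> 16)) / 4294967296.0;     // uniform-ish in [0,1)
    }

    double worleyF2minusF1(double px, double py)
    {
        int cx = (int)std::floor(px), cy = (int)std::floor(py);
        double F1 = 1e9, F2 = 1e9;                 // nearest / second nearest
        for (int j = -1; j <= 1; ++j)
            for (int i = -1; i <= 1; ++i) {
                double fx = cx + i + hash01(cx + i, cy + j, 0);
                double fy = cy + j + hash01(cx + i, cy + j, 1);
                double d = std::hypot(px - fx, py - fy);
                if (d < F1) { F2 = F1; F1 = d; }
                else if (d < F2) F2 = d;
            }
        return F2 - F1;   // near zero at cell boundaries, larger inside cells
    }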

I started adding things to the scene.

Here's a pair of images showing the difference translucency makes in the image. You can see the light bleeding through Lucy from the rest of the scene, and the god rays in the background.

And once again, the final image.