Slim, Part II (Global illumination or 'GI')

What is GI?

It is the opposite of local illumination. To compute GI, a shading sample needs information not just about itself but also about its surroundings (hence 'global' illumination).

GI is an umbrella term for a variety of phenomena. Six of them are presented below (raytracing, radiosity, ambient occlusion, image-based illumination, subsurface scattering, caustics).

GI hugely bumps up the level of photorealism achievable via rendering, hence our interest in it.

Here are the "new" calls that were added to RSL to support GI:

In addition, shadow() and environment() are also helpful in writing GI shaders.
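For reference, the GI-related shadeops in question include the following (signatures sketched from memory of the PRMan docs - double-check them there before relying on the exact argument lists):

```rsl
// color trace (point P, vector dir)                 /* fire a single ray */
// float occlusion (point P, normal N, float samples, ...)
// color indirectdiffuse (point P, normal N, float samples, ...)
// color transmission (point Psrc, point Pdst, ...)  /* ray-traced shadowing */
// color caustic (point P, normal N)                 /* photon-map lookup */
// gather (...) { /* ray hit */ } else { /* ray miss */ }
```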

Here are descriptions (from Pixar's docs) of the calls. Also, this page contains scans of recent years' PRMan brochures (which contain useful info. related to GI and other features, and other Pixar/RenderMan trivia).


Ways to raytrace:

This is a simple scene to demo raytracing, and this is yet another.

Here is yet another example: here is the Maya scene, which produces this render:

This is the shader network for the above:
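As a rough sketch of what such a raytracing setup boils down to in hand-written RSL (shader and parameter names here are invented for illustration), trace() fires a single reflection ray and returns the color of whatever it hits. Remember that raytracing must be enabled, and objects made visible to rays, for trace() to return anything:

```rsl
surface simple_mirror (float Kd = 0.5, Kr = 0.5)
{
    normal Nn = normalize(faceforward(normalize(N), I));
    vector R  = reflect(normalize(I), Nn);   /* mirror direction */

    /* trace() returns black if the ray escapes the scene */
    color refl = trace(P, R);

    Ci = Os * (Kd * Cs * diffuse(Nn) + Kr * refl);
    Oi = Os;
}
```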


The easiest way to obtain "radiosity" (a term borrowed from heat transfer) is with a shader containing the indirectdiffuse() call, as with '' below. This call evaluates shading on surfaces adjacent to the current (our) shading sample, and bleeds those colors onto our surface. When there are no surfaces nearby, color is instead looked up in an environment map (eg. using 'wh.tex' below). You can add this effect as a layer, as is done in '' where it is combined with regular diffuse shading.

A particularly nice template to attach on to adjacent surfaces is the 'ColorCells' one..

Open the following scene, then import the .slo shader into Slim and attach it to the ground plane. You also need the 'wh.tex' texture map in C:/temp.

Scene [study this shader, it uses the indirectdiffuse() RSL call mentioned earlier]

Another way to get indirect diffuse illumination: use an AdditiveFX over a receiving surface, add a 'Constant' appearance as the base shader, and for 'ch1', add an 'IndirectDiffuse' template.
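If you prefer hand-written RSL over the Slim templates, a minimal color-bleeding shader might look like the sketch below. The names are invented, and the "environmentmap" optional argument (which gives escaping rays a map to fall back on) is from memory of the docs, so verify it there:

```rsl
surface indirect_demo (float Kd = 0.8, Ki = 1, samples = 64;
                       string envmap = "wh.tex")
{
    normal Nn = normalize(faceforward(normalize(N), I));

    /* gather colors from nearby surfaces; rays that escape
       the scene look up envmap instead */
    color ind = indirectdiffuse(P, Nn, samples, "environmentmap", envmap);

    /* layer the indirect term over regular diffuse shading */
    Ci = Os * Cs * (Kd * diffuse(Nn) + Ki * ind);
    Oi = Os;
}
```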

Ambient occlusion

The following scene shows the use of the 'Occlusion' template, which determines what fraction of the environment is visible from a shading point. Using this info. to shade gives us a certain realism that is very familiar - it mimics the look of (sun)light not being able to enter nooks and crannies, the undersides of overhangs, eaves, etc.

'Real world ambient occlusion':

Here is a simple scene that calculates ambient occlusion, using the built-in (Slim's) AmbientOcclusion template. Note that the occlusion template feeds its value to a 'ColorSplines' appearance, which in turn feeds a 'Constant' appearance: Occlusion -> ColorSplines -> Constant. This is for creative control - the ramp in the ColorSplines appearance can be set to all sorts of patterns (eg. b/w stripes) to produce pretty outputs.

You can also render ambient occlusion using your own surface shader (compiled .slo attached to surfaces), such as this one by Bob Moyer (Texas A&M):

surface amb_occ (float samples = 1; color Camb = 1)
{
     float sum = 0;
     normal Nn = normalize(N);
     vector R = reflect(normalize(I), Nn);

     /* shoot 'samples' rays in a cosine-weighted cone around R;
        the empty block ignores hits, the else block counts misses */
     gather ("illuminance", P, R, PI/2, samples, "distribution", "cosine") {
     } else {
         sum = sum + 1;
     }
     sum = sum / samples;

     Ci = Camb * sum;
     Oi = Os;
}


In the above, the gather() call does distributed raytracing to count ray 'misses' (the fraction of rays that are able to "escape" the scene by not hitting anything). This miss fraction, 'sum', is used directly as a brightness multiplier - fully unoccluded points shade brightest. As an alternative, you can also use the built-in occlusion() call, which encapsulates the above calculation.
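A sketch of that occlusion() variant follows (shader and parameter names invented; occlusion() takes further optional arguments - see the docs):

```rsl
surface amb_occ2 (float samples = 64; color Camb = 1)
{
    normal Nn = normalize(faceforward(normalize(N), I));

    /* occ = fraction of the hemisphere around Nn that is blocked */
    float occ = occlusion(P, Nn, samples);

    Ci = Camb * (1 - occ);   /* unoccluded areas render brightest */
    Oi = Os;
}
```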


The idea behind image-based-illumination (IBI) is that illumination contained in images is used as light sources to light CG scenes. The easiest way to experiment with IBI is to use the Slim IBI appearance..

Here is a sample scene containing an IBI appearance, and this is the rendered result (why does this look so 'real'? Answer - the sharp bright highlights are shaped, they are not all plain circles!):

The env. map used is the following (Grace Cathedral in San Francisco), and comes from Paul Debevec's light probes gallery.

Here is Paul's 'grace_probe' HDR image (.hdr file format) which is pictured above (and used to produce the render shown above), and here is the txmake-converted .tex version used in the scene linked above. The .tex file is huge - over 11MB - instead of downloading the .tex file it might be quicker for you (and more instructive) to get the .hdr image instead and do the .tex conversion yourself.

Here is how to convert .hdr images into PRMan's .tex format (easiest to use a 'cygwin' shell to do this), in order to be read into an IBI appearance - make sure to add the '-format tiff' specifier:

txmake -format tiff -envlatl -float hdr_image.hdr hdr_image.tex
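Once converted, the latlong map can also be read back directly with the environment() shadeop. A minimal (invented) lookup shader might be:

```rsl
surface env_lookup (string envmap = "hdr_image.tex"; float Kenv = 1)
{
    normal Nn = normalize(faceforward(normalize(N), I));
    vector R  = reflect(normalize(I), Nn);

    color env = 0;
    if (envmap != "")
        /* look up the latlong map along the reflection direction,
           expressed in world space */
        env = environment(envmap, vtransform("world", R));

    Ci = Os * Kenv * env;
    Oi = Os;
}
```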

Additional things to try with the above scene:

This is another IBI scene and here is the tiff file (not a .hdr one!) to use with it after converting it to .tex.

Steps to light using an HDR image, using an 'Environment' light appearance instead of 'IBI':

If you used a Plastic template, hook up to its Color attribute a pattern (such as Oak Grain), texture node, etc. You can get very beautiful and realistic renders by layering an IBL component onto an existing surface.

Additionally, you can create a coord system (axis) in Maya (MTOR), rotate/scale/translate it, and enter the name (of the SHAPE node of the coord sys you created) in the 'Coordinate System' field of the Environment template. Doing so lets you control the coordinate system for the IBL .hdr "light" you are using to shade.

You can also try doing IBL using RfM (RenderMan for Maya):

Subsurface scattering

If you can't get the following scene to work, play with the subsurf_adaptive shader settings and observe the effects on the Slim 'shader ball'.

Here is a simple scene with this shader applied.

These are some rendered results:

Here subsurface scattering is turned off using Ksub=0.0.

Ksub = 2 now..

Ksub = 3.

This is another SSS scene..

You can easily do SSS using RfM. In any Maya material, do 'Attributes -> RenderMan -> Add Subsurface Scattering'. Then look at the bottom of the AE attributes pane, for 'Extra RenderMan Attributes' - adjust values, render :)


If photons came out streaming from light sources (like small hard balls), got reflected/refracted by objects in the scene and eventually settled down, what would their spatial distribution look like? That is the calculation made in order to generate a 'photon map'. Once the photon map is created, regular rendering (from the scene camera) can occur, during which the photon map is read in to create pools of light known as caustics.

A couple of real-world caustics ("transmission caustics" first, followed by "reflection caustics"):

Here's how you can create reflection caustics (similar steps for transmission; also, turn raytracing 'on' first):

Here is a scene created using the steps above. The Slim nodes look like this:

To get the exact effect you want, you can tweak the caustics multiplier and light intensity (mtorSpotLight), the number of photons emitted and the estimator (PhotonMap), the shading rate (RenderMan globals) and spotlight placement (Maya). You can adjust additional values in Ensemble and Deluxe. Granted, this isn't a pushbutton solution; on the flip side it gives you a lot of flexibility.
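For a hand-rolled receiver surface, the caustic() shadeop looks up the photon map around the shading point, and its result can be layered over ordinary diffuse shading (names here are illustrative):

```rsl
surface caustic_receiver (float Kd = 1, Kc = 1)
{
    normal Nn = normalize(faceforward(normalize(N), I));

    /* caustic() returns the photon-map energy gathered around P */
    Ci = Os * Cs * (Kd * diffuse(Nn) + Kc * caustic(P, Nn));
    Oi = Os;
}
```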

This is Pixar's photon map Slim template, FYI (note that it uses the photonmap() RSL call). Here are four older scenes (might not work - OK if they don't - just follow the steps above to make your own):

Finally, here is a nice way to create transmission caustics for underwater (pool) reflections.

What's next?

Good luck, HAVE FUN!