Exercises

  1. Given a 1D volume density that is an arbitrary function of height $f(h)$, the optical distance between any two 3D points can be computed very efficiently if the integral $\int_0^{h'} f(h)\,\mathrm{d}h$ is precomputed and stored in a table for a set of $h'$ values (Perlin 1985b; Max 1986). Work through the mathematics to show the derivation for this approach (a sketch of one possible derivation appears after these exercises), and implement it in pbrt by writing a new Medium that takes an arbitrary function or a 1D table of density values. Compare the efficiency and accuracy of this approach to the default implementation of Medium::Tr(), which uses Monte Carlo integration.
  2. The GridDensityMedium class uses a relatively large amount of memory for complex volume densities. Determine its memory requirements when used for the smoke images in this chapter, and modify its implementation to reduce memory use. One approach is to detect regions of space with constant (or relatively constant) density values using an octree data structure and to refine the octree only in regions where the densities are changing. Another possibility is to use less memory to record each density value, for example by computing the minimum and maximum densities and then using 8 or 16 bits per density value to interpolate between them (a sketch of this quantization appears after these exercises). What sorts of errors appear when either of these approaches is pushed too far?
  3. Implement a new Medium that computes the scattering density at points in the medium procedurally, for example by using procedural noise functions like those discussed in Section 10.6; a sketch of one possible density function of this kind appears after these exercises. You may find useful inspiration for procedural volume modeling primitives in Wrenninge’s book (2012).
  4. A shortcoming of a fully procedural Medium like the one in Exercise 11.3 can be the inefficiency of repeatedly evaluating the medium’s procedural functions. Add a caching layer to a procedural medium that, for example, maintains a set of small regular voxel grids over regions of space; one possible organization of such a cache is sketched after these exercises. When a density lookup is performed, first check the cache to see if a value can be interpolated from one of the grids; otherwise, update the cache to store the density function over a region of space that includes the lookup point. Study how many cache entries (and consequently how much memory) are needed for good performance. How do the cache size requirements change when volumetric path tracing accounts only for direct lighting versus when it computes full global illumination? How do you explain this difference?
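
As a starting point for the derivation asked for in Exercise 1, here is one way it might go, assuming the segment endpoints $\mathbf{p}_0$ and $\mathbf{p}_1$ have distinct heights $h_0 \ne h_1$ (for a level segment, the optical distance is simply $\|\mathbf{p}_1 - \mathbf{p}_0\|\, f(h_0)$). Because height varies linearly along the segment, changing the variable of integration from parametric distance $t$ to height $h$ gives $\mathrm{d}t = \|\mathbf{p}_1 - \mathbf{p}_0\| / (h_1 - h_0)\,\mathrm{d}h$, so the optical distance reduces to a difference of two values of the tabulated cumulative integral:

    \tau(\mathbf{p}_0, \mathbf{p}_1)
        = \int_0^{\|\mathbf{p}_1 - \mathbf{p}_0\|} f\bigl(h(t)\bigr)\,\mathrm{d}t
        = \frac{\|\mathbf{p}_1 - \mathbf{p}_0\|}{h_1 - h_0} \int_{h_0}^{h_1} f(h)\,\mathrm{d}h
        = \frac{\|\mathbf{p}_1 - \mathbf{p}_0\|}{h_1 - h_0} \bigl(F(h_1) - F(h_0)\bigr),
    \quad \text{where } F(h') = \int_0^{h'} f(h)\,\mathrm{d}h.

In an implementation, $F$ would be evaluated by interpolating between the tabulated values at the $h'$ samples; the accuracy of that interpolation, rather than Monte Carlo variance, then becomes the main source of error to compare against Medium::Tr().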
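
For the quantized-storage approach in Exercise 2, the sketch below shows one possible shape for the underlying storage. The class and member names are illustrative rather than pbrt's, and a complete solution would still need the medium-space transformation and trilinear interpolation that GridDensityMedium performs on top of this storage.

    #include <algorithm>
    #include <cstdint>
    #include <vector>

    // Hypothetical quantized density storage: each density is stored as an
    // 8-bit offset between the grid's minimum and maximum values.
    class QuantizedDensityGrid {
      public:
        QuantizedDensityGrid(const std::vector<float> &d)
            : minDensity(*std::min_element(d.begin(), d.end())),
              maxDensity(*std::max_element(d.begin(), d.end())) {
            float range = std::max(maxDensity - minDensity, 1e-9f);
            quantized.reserve(d.size());
            for (float v : d)
                // Round to the nearest of 256 evenly spaced levels.
                quantized.push_back(
                    (uint8_t)(255.f * (v - minDensity) / range + 0.5f));
        }
        // Reconstruct an approximate density by interpolating between the extremes.
        float Density(size_t i) const {
            return minDensity + (quantized[i] / 255.f) * (maxDensity - minDensity);
        }

      private:
        float minDensity, maxDensity;
        std::vector<uint8_t> quantized;
    };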
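
For Exercise 3, the sketch below illustrates the general flavor of a procedural density: a fractal sum of octaves of a cheap hash-based value noise, thresholded so that part of space is empty. All of the names here are illustrative; pbrt's Noise() and FBm() functions from Section 10.6 would be higher-quality substitutes for the placeholder noise.

    #include <algorithm>
    #include <cmath>
    #include <cstdint>

    // Placeholder lattice hash returning a pseudo-random value in [0, 1).
    static float HashNoise(int x, int y, int z) {
        uint32_t h = (uint32_t)x * 73856093u ^ (uint32_t)y * 19349663u ^
                     (uint32_t)z * 83492791u;
        h = (h ^ (h >> 13)) * 0x5bd1e995u;
        return (h & 0xffffffu) / float(1 << 24);
    }

    // Trilinearly interpolated value noise at a 3D point.
    static float ValueNoise(float x, float y, float z) {
        int ix = (int)std::floor(x), iy = (int)std::floor(y), iz = (int)std::floor(z);
        float fx = x - ix, fy = y - iy, fz = z - iz;
        auto lerp = [](float t, float a, float b) { return (1 - t) * a + t * b; };
        float d[2][2][2];
        for (int dz = 0; dz < 2; ++dz)
            for (int dy = 0; dy < 2; ++dy)
                for (int dx = 0; dx < 2; ++dx)
                    d[dz][dy][dx] = HashNoise(ix + dx, iy + dy, iz + dz);
        return lerp(fz,
                    lerp(fy, lerp(fx, d[0][0][0], d[0][0][1]),
                             lerp(fx, d[0][1][0], d[0][1][1])),
                    lerp(fy, lerp(fx, d[1][0][0], d[1][0][1]),
                             lerp(fx, d[1][1][0], d[1][1][1])));
    }

    // Fractal sum of noise octaves, thresholded so that much of space is empty.
    float ProceduralDensity(float x, float y, float z) {
        float sum = 0, freq = 1, amp = 1;
        for (int octave = 0; octave < 5; ++octave) {
            sum += amp * ValueNoise(freq * x, freq * y, freq * z);
            freq *= 2;
            amp *= 0.5f;
        }
        return std::max(0.f, sum - 0.5f);
    }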
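
For the caching layer in Exercise 4, one possible organization is a hash map from block coordinates to small, densely stored voxel grids that are filled on first use, as sketched below. The names, the block size, the nearest-voxel lookup, and the assumption of one voxel per unit of distance are all illustrative choices; a real implementation would also need interpolation across voxels, thread safety, and an eviction policy that bounds memory use.

    #include <cmath>
    #include <cstdint>
    #include <unordered_map>
    #include <vector>

    // Density function being cached, e.g., the procedural sketch above.
    float ProceduralDensity(float x, float y, float z);

    // Hypothetical cache of BlockSize^3 voxel blocks, each filled on first use.
    class DensityCache {
      public:
        float Density(float x, float y, float z) {
            // Find the block containing the lookup point, filling it if needed.
            int bx = (int)std::floor(x / BlockSize), by = (int)std::floor(y / BlockSize),
                bz = (int)std::floor(z / BlockSize);
            auto iter = blocks.find(Key(bx, by, bz));
            if (iter == blocks.end())
                iter = blocks.emplace(Key(bx, by, bz), FillBlock(bx, by, bz)).first;
            // Nearest-voxel lookup within the block (interpolation omitted).
            int vx = (int)std::floor(x) - bx * BlockSize,
                vy = (int)std::floor(y) - by * BlockSize,
                vz = (int)std::floor(z) - bz * BlockSize;
            return iter->second[(vz * BlockSize + vy) * BlockSize + vx];
        }
        // Memory devoted to cached density values.
        size_t BytesUsed() const {
            return blocks.size() * BlockSize * BlockSize * BlockSize * sizeof(float);
        }

      private:
        static constexpr int BlockSize = 8;
        // Pack the low 16 bits of each block coordinate into one key; this is
        // adequate for volumes of modest extent.
        static uint64_t Key(int x, int y, int z) {
            return ((uint64_t)(uint16_t)x << 32) | ((uint64_t)(uint16_t)y << 16) |
                   (uint64_t)(uint16_t)z;
        }
        // Evaluate the density at every voxel of the block exactly once.
        std::vector<float> FillBlock(int bx, int by, int bz) {
            std::vector<float> d(BlockSize * BlockSize * BlockSize);
            for (int z = 0; z < BlockSize; ++z)
                for (int y = 0; y < BlockSize; ++y)
                    for (int x = 0; x < BlockSize; ++x)
                        d[(z * BlockSize + y) * BlockSize + x] = ProceduralDensity(
                            bx * BlockSize + x, by * BlockSize + y, bz * BlockSize + z);
            return d;
        }
        std::unordered_map<uint64_t, std::vector<float>> blocks;
    };

The BytesUsed() method is included only to make it easy to instrument the memory-versus-performance study that the exercise asks for.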