## 14.5 Path Tracing

Now that we have derived the path integral form of the light transport equation, we’ll show how it can be used to derive the path-tracing light transport algorithm and will present a path-tracing integrator. Figure 14.17 compares images of a scene rendered with different numbers of pixel samples using the path-tracing integrator. In general, hundreds or thousands of samples per pixel may be necessary for high-quality results—potentially a substantial computational expense.

Path tracing was the first general-purpose unbiased Monte Carlo light transport algorithm used in graphics. Kajiya (1986) introduced it in the same paper that first described the light transport equation. Path tracing incrementally generates paths of scattering events starting at the camera and ending at light sources in the scene. One way to think of it is as an extension of Whitted’s method to include both delta distribution and nondelta BSDFs and light sources, rather than just accounting for the delta terms.

Although it is slightly easier to derive path tracing directly from the basic light transport equation, we will instead approach it from the path integral form, which helps build understanding of the path integral equation and will make the generalization to bidirectional path tracing (Section 16.3) easier to understand.

### 14.5.1 Overview

Given the path integral form of the LTE, we would like to estimate the value of the exitant radiance from the camera ray's intersection point $p_1$,

$$L(p_1 \to p_0) = \sum_{i=1}^{\infty} P(\bar p_i),$$

for a given camera ray from $p_0$ that first intersects the scene at $p_1$. We have two problems that must be solved in order to compute this estimate:

1. How do we estimate the value of the sum of the infinite number of terms with a finite amount of computation?
2. Given a particular term, how do we generate one or more paths in order to compute a Monte Carlo estimate of its multidimensional integral?

For path tracing, we can take advantage of the fact that for physically valid scenes, paths with more vertices scatter less light than paths with fewer vertices overall (this isn't necessarily true for any particular pair of paths, just in the aggregate). This is a natural consequence of conservation of energy in BSDFs. Therefore, we will always estimate the first few terms and will then start to apply Russian roulette to stop sampling after a finite number of terms without introducing bias. Recall from Section 13.7 that Russian roulette allows us to probabilistically stop computing terms in a sum as long as we reweight the terms that are not terminated. For example, if we always computed estimates of $P(\bar p_1)$, $P(\bar p_2)$, and $P(\bar p_3)$ but stopped without computing more terms with probability $q$, then an unbiased estimate of the sum would be

$$P(\bar p_1) + P(\bar p_2) + P(\bar p_3) + \frac{1}{1-q} \sum_{i=4}^{\infty} P(\bar p_i).$$

Using Russian roulette in this way doesn't solve the problem of needing to evaluate an infinite sum, but it has pushed the problem a bit farther out.

If we take this idea a step further and instead randomly consider terminating evaluation of the sum at each term with probability $q_i$,

$$\frac{1}{1-q_1}\left(P(\bar p_1) + \frac{1}{1-q_2}\left(P(\bar p_2) + \frac{1}{1-q_3}\left(P(\bar p_3) + \cdots\right)\right)\right),$$

we will eventually stop continued evaluation of the sum. Yet, because for any particular value of $i$ there is greater than zero probability of evaluating the term $P(\bar p_i)$ and because it will be weighted appropriately if we do evaluate it, the final result is an unbiased estimate of the sum.

### 14.5.2 Path Sampling

Given this method for evaluating only a finite number of terms of the infinite sum, we also need a way to estimate the contribution of a particular term $P(\bar p_i)$. We need $i+1$ vertices to specify the path, where the last vertex $p_i$ is on a light source and the first vertex $p_0$ is a point on the camera film or lens (Figure 14.18). Looking at the form of $P(\bar p_i)$, a multiple integral over surface area of objects in the scene, the most natural thing to do is to sample vertices $p_j$ according to the surface area of objects in the scene, such that it's equally probable to sample any particular point on an object in the scene for $p_j$ as any other point. (We don't actually use this approach in the PathIntegrator implementation for reasons that will be described later, but this sampling technique could possibly be used to improve the efficiency of our basic implementation and helps to clarify the meaning of the path integral LTE.)

We could define a discrete probability over the $n$ objects in the scene. If each object $j$ has surface area $A_j$, then the probability of sampling a path vertex on its surface should be

$$p_j = \frac{A_j}{\sum_k A_k}.$$

Then, given a method to sample a point on the $j$th object with uniform probability, the PDF for sampling any particular point on object $j$ is $1/A_j$. Thus, the overall probability density for sampling the point is

$$\frac{A_j}{\sum_k A_k} \cdot \frac{1}{A_j}.$$

And all samples $p_i$ have the same PDF value:

$$p_A(p_i) = \frac{1}{\sum_k A_k}.$$

It’s reassuring that they all have the same weight, since our intent was to choose among all points on surfaces in the scene with equal probability.

Given the set of vertices $p_1, \ldots, p_{i-1}$ sampled in this manner, we can then sample the last vertex $p_i$ on a light source in the scene, defining its PDF in the same way. Although we could use the same technique used for sampling path vertices to sample points on lights, this would lead to high variance, since for all of the paths where $p_i$ wasn't on the surface of an emitter, the path would have zero value. The expected value would still be the correct value of the integral, but convergence would be extremely slow. A better approach is to sample over the areas of only the emitting objects, with probabilities updated accordingly. Given a complete path, we have all of the information we need to compute the estimate of $P(\bar p_i)$; it's just a matter of evaluating each of the terms.

It’s easy to be more creative about how we set the sampling probabilities with this general approach. For example, if we knew that indirect illumination from a few objects contributed to most of the lighting in the scene, we could assign a higher probability to generating path vertices on those objects, updating the sample weights appropriately.

However, there are two interrelated problems with sampling paths in this manner. The first can lead to high variance, while the second can lead to incorrect results. The first problem is that many of the paths will have no contribution if they have pairs of adjacent vertices that are not mutually visible. Consider applying this area sampling method in a complex building model: adjacent vertices in the path will almost always have a wall or two between them, giving no contribution for the path and high variance in the estimate.

The second problem is that if the integrand has delta functions in it (e.g., a point light source or a perfectly specular BSDF), this sampling technique will never be able to choose path vertices such that the delta distributions are nonzero. Even if there aren't delta distributions, as the BSDFs become increasingly glossy almost all of the paths will have low contributions, since the points in $\bar p_i$ will cause the BSDF to have a small or zero value, and again we will suffer from high variance. In a similar manner, small area light sources can also be sources of variance if not sampled explicitly.

### 14.5.3 Incremental Path Construction

A solution that solves both of these problems is to construct the path incrementally, starting from the vertex at the camera, $p_0$. At each vertex, the BSDF is sampled to generate a new direction; the next vertex $p_{i+1}$ is found by tracing a ray from $p_i$ in the sampled direction and finding the closest intersection. We are effectively trying to find a path with a large overall contribution by making a series of choices that find directions with important local contributions. While one can imagine situations where this approach could be ineffective, it is generally a good strategy.

Because this approach constructs the path by sampling BSDFs according to solid angle, and because the path integral LTE is an integral over surface area in the scene, we need to apply the correction to convert from the probability density according to solid angle, $p_\omega$, to a density according to area, $p_A$ (recall Section 5.5):

$$p_A = p_\omega \, \frac{|\cos \theta_i|}{\|p_i - p_{i+1}\|^2}.$$

This correction causes all of the factors of the geometric term $G(p_i \leftrightarrow p_{i+1})$ to cancel out of $P(\bar p_i)$ except for the $\cos\theta_{i+1}$ term. Furthermore, we already know that $p_i$ and $p_{i+1}$ must be mutually visible since we traced a ray to find $p_{i+1}$, so the visibility term is trivially equal to 1. An alternative way to think about this is that ray tracing provides an operation to importance sample the visibility component of $G$. Therefore, if we use this sampling technique but still sample the last vertex $p_i$ from some distribution over the surfaces of light sources $p_A(p_i)$, the value of the Monte Carlo estimate for a path is

$$P(\bar p_i) \approx \frac{L_e(p_i \to p_{i-1}) \, f(p_i \to p_{i-1} \to p_{i-2}) \, G(p_i \leftrightarrow p_{i-1})}{p_A(p_i)} \left( \prod_{j=1}^{i-2} \frac{f(p_{j+1} \to p_j \to p_{j-1}) \, |\cos \theta_j|}{p_\omega(p_{j+1} - p_j)} \right).$$

### 14.5.4 Implementation

Our path-tracing implementation computes an estimate of the sum of path contributions using the approach described in the previous subsection. Starting at the first intersection of the camera ray with the scene geometry, $p_1$, it incrementally samples path vertices by sampling from the BSDF's sampling distribution at the current vertex and tracing a ray to the next vertex. To find the last vertex of a particular path, $p_i$, which must be on a light source in the scene, it uses the multiple importance sampling–based direct lighting code that was developed for the direct lighting integrator. By using the multiple importance sampling weights instead of $p_A(p_i)$ to compute the estimate as described earlier, we have lower variance in the result for cases where sampling the BSDF would have been a better way to find a point on the light.

Beyond how lights are sampled, another small difference is that as the estimates of the path contribution terms are being evaluated, the vertices of the previous path of length $i$ (everything except the vertex on the emitter) are reused as a starting point when constructing the path of length $i+1$. This means that it is only necessary to trace one more ray to construct the new path, rather than $i$ rays as we would if we started from scratch. Reusing paths in this manner does introduce correlation among all of the terms in the sum, which slightly reduces the quality of the result, although in practice this is more than made up for by the improved overall efficiency due to tracing fewer rays.

```cpp
<<PathIntegrator Declarations>>=
class PathIntegrator : public SamplerIntegrator {
  public:
    <<PathIntegrator Public Methods>>
    Spectrum Li(const RayDifferential &ray, const Scene &scene,
                Sampler &sampler, MemoryArena &arena, int depth) const;
  private:
    <<PathIntegrator Private Data>>
};
```

Although Russian roulette is used here to terminate path sampling in the manner described earlier, the integrator also supports a maximum depth. It can be set to a large value if only Russian roulette should be used to terminate paths.

```cpp
<<PathIntegrator Public Methods>>=
PathIntegrator(int maxDepth, std::shared_ptr<const Camera> camera,
               std::shared_ptr<Sampler> sampler)
    : SamplerIntegrator(camera, sampler), maxDepth(maxDepth) { }
```

```cpp
<<PathIntegrator Private Data>>=
const int maxDepth;
```

A number of variables record the current state of the path. beta holds the path throughput weight, which is defined as the accumulated factors of the throughput function $T(\bar p_i)$: the product of the BSDF values and cosine terms for the vertices generated so far, divided by their respective sampling PDFs,

$$\beta = \prod_{j=1}^{i-2} \frac{f(p_{j+1} \to p_j \to p_{j-1}) \, |\cos \theta_j|}{p_\omega(p_{j+1} - p_j)}.$$

Thus, the product of beta with the scattered light from direct lighting at the final vertex of the path gives the contribution for the path. (This quantity will reoccur many times in the following two chapters, and we will consistently refer to it as $\beta$.) Because the effect of earlier path vertices is aggregated in this way, there is no need to store the positions and BSDFs of all of the vertices of the path, only the last one.

In the following implementation, L holds the radiance value from the running total of $\sum_i P(\bar p_i)$, ray holds the next ray to be traced to extend the path one more vertex, and specularBounce records whether the last outgoing path direction sampled was due to specular reflection; the need to track this will be explained shortly.

```cpp
<<PathIntegrator Method Definitions>>=
Spectrum PathIntegrator::Li(const RayDifferential &r, const Scene &scene,
                            Sampler &sampler, MemoryArena &arena,
                            int depth) const {
    Spectrum L(0.f), beta(1.f);
    RayDifferential ray(r);
    bool specularBounce = false;
    for (int bounces = 0; ; ++bounces) {
        <<Find next path vertex and accumulate contribution>>
    }
    return L;
}
```

Each time through the for loop of the integrator, the next vertex of the path is found by intersecting the current ray with the scene geometry and computing the contribution of the path to the overall radiance value with the direct lighting code. A new direction is then chosen by sampling from the BSDF’s distribution at the last vertex of the path. After a few vertices have been sampled, Russian roulette is used to randomly terminate the path.

```cpp
<<Find next path vertex and accumulate contribution>>=
<<Intersect ray with scene and store intersection in isect>>
<<Possibly add emitted light at intersection>>
<<Terminate path if ray escaped or maxDepth was reached>>
<<Compute scattering functions and skip over medium boundaries>>
<<Sample illumination from lights to find path contribution>>
<<Sample BSDF to get new path direction>>
<<Account for subsurface scattering, if applicable>>
<<Possibly terminate the path with Russian roulette>>
```

The first step in the loop is to find the next path vertex by intersecting ray against the scene geometry.

```cpp
<<Intersect ray with scene and store intersection in isect>>=
SurfaceInteraction isect;
bool foundIntersection = scene.Intersect(ray, &isect);
```

If the ray hits an object that is emissive, the emission is usually ignored, since the loop iteration at the previous path vertex performed a direct illumination estimate that already accounted for its effect. The same is true when a ray escapes into an emissive environment. However, there are two exceptions: the first is at the initial intersection point of camera rays, since this is the only opportunity to include emission from directly visible objects. The second is when the sampled direction from the last path vertex was from a specular BSDF component: in this case, the previous iteration’s direct illumination estimate could not evaluate the associated integrand containing a Dirac delta function, and we must account for it here.

```cpp
<<Possibly add emitted light at intersection>>=
if (bounces == 0 || specularBounce) {
    <<Add emitted light at path vertex or from the environment>>
}
```

When no intersection is found, the ray has escaped the scene and thus the path sampling iteration terminates. Similarly, the iteration terminates when bounces exceeds the prescribed maximum value.

```cpp
<<Terminate path if ray escaped or maxDepth was reached>>=
if (!foundIntersection || bounces >= maxDepth) break;
```

When emitted light should be included, the path throughput weight must be multiplied with the radiance emitted by the current path vertex (if an intersection was found) or radiance emitted by infinite area light sources, if present.

```cpp
<<Add emitted light at path vertex or from the environment>>=
if (foundIntersection)
    L += beta * isect.Le(-ray.d);
else
    for (const auto &light : scene.lights)
        L += beta * light->Le(ray);
```

Before estimating the direct illumination at the current vertex, it is necessary to compute the scattering functions at the vertex. A special case arises when SurfaceInteraction::bsdf is equal to nullptr, which indicates that the current surface has no effect on light. pbrt uses such surfaces to represent transitions between participating media, whose boundaries are themselves optically inactive (i.e., they have the same index of refraction on both sides). Since the basic PathIntegrator ignores media, it simply skips over such surfaces without counting them as scattering events in the bounces counter.

```cpp
<<Compute scattering functions and skip over medium boundaries>>=
isect.ComputeScatteringFunctions(ray, arena, true);
if (!isect.bsdf) {
    ray = isect.SpawnRay(ray.d);
    bounces--;
    continue;
}
```

The direct lighting computation uses the UniformSampleOneLight() function, which gives an estimate of the exitant radiance from direct lighting at the vertex at the end of the current path. Scaling this value by the path throughput weight gives its overall contribution to the total radiance estimate.

```cpp
<<Sample illumination from lights to find path contribution>>=
L += beta * UniformSampleOneLight(isect, scene, arena, sampler);
```

Now it is necessary to sample the BSDF at the vertex at the end of the current path to get an outgoing direction for the next ray to trace. The integrator updates the path throughput weight as described earlier and initializes ray with the ray to be traced to find the next vertex in the next iteration of the for loop.

```cpp
<<Sample BSDF to get new path direction>>=
Vector3f wo = -ray.d, wi;
Float pdf;
BxDFType flags;
Spectrum f = isect.bsdf->Sample_f(wo, &wi, sampler.Get2D(), &pdf,
                                  BSDF_ALL, &flags);
if (f.IsBlack() || pdf == 0.f) break;
beta *= f * AbsDot(wi, isect.shading.n) / pdf;
specularBounce = (flags & BSDF_SPECULAR) != 0;
ray = isect.SpawnRay(wi);
```

The case where a ray refracts into a material with a BSSRDF is handled specially in the fragment <<Account for subsurface scattering, if applicable>>, which is implemented in Section 15.4.3 after subsurface scattering has been discussed in more detail.

Path termination kicks in after a few bounces, with termination probability set based on the path throughput weight. In general, it’s worth having a higher probability of terminating low-contributing paths, since they have relatively less impact on the final image. (A minimum termination probability ensures termination is possible if beta is large; for example, due to a large BSDF value divided by a low sampling probability.) If the path isn’t terminated, beta is updated with the Russian roulette weight and all subsequent terms will be appropriately affected by it.

```cpp
<<Possibly terminate the path with Russian roulette>>=
if (bounces > 3) {
    Float q = std::max((Float).05, 1 - beta.y());
    if (sampler.Get1D() < q) break;
    beta /= 1 - q;
}
```