## 9.3 Bump Mapping

All of the `Material`s defined in the previous section take an optional
floating-point texture that defines a displacement at each point on the
surface: each point $p$ has a displaced point $p'$ associated with
it, defined by $p' = p + d(p)\,\mathbf{n}(p)$, where
$d(p)$ is the offset returned by the displacement texture at $p$ and
$\mathbf{n}(p)$ is the surface normal at $p$
(Figure 9.6). We would like to use this texture to
compute shading normals so that the surface appears as if it actually had
been offset by the displacement function, without modifying its geometry.
This process is called *bump mapping*. For relatively small
displacement functions, the visual effect of bump mapping can be quite
convincing. This idea, and the specific technique for computing these shading
normals in a way that gives a plausible appearance of the actual displaced
surface, were developed by Blinn (1978).
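In code, the displaced-point definition amounts to the following sketch (the `Vec3` type and the constant `Displacement()` function here are hypothetical stand-ins, not pbrt's actual `Point3f`/`Normal3f` classes):

```cpp
#include <cassert>
#include <cmath>

// Minimal 3-vector for illustration; not pbrt's actual vector classes.
struct Vec3 {
    float x, y, z;
};

// Hypothetical displacement function: a constant offset everywhere.
float Displacement(const Vec3 &p) { return 0.1f; }

// The displaced point p' = p + d(p) * n(p).
Vec3 Displace(const Vec3 &p, const Vec3 &n) {
    float d = Displacement(p);
    return {p.x + d * n.x, p.y + d * n.y, p.z + d * n.z};
}
```

For a point on the unit sphere with its outward normal, this pushes the point outward along the normal by the displacement amount.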

`pbrt` doesn’t compute a geometric representation of this displaced surface; instead, it uses the displacement function only to compute shading normals for bump mapping.

Figure 9.7 shows the effect of applying bump mapping defined by an image map of a grid of lines to a sphere.

A more complex example is shown in Figure 9.8, which shows a scene rendered with and without bump mapping. There, the bump map gives the appearance of a substantial amount of detail in the walls and floors that isn’t actually present in the geometric model.

Figure 9.9 shows one of the image maps used to define the bump function in Figure 9.8.

The `Material::Bump()` method is a utility routine for use by
`Material` implementations. It is responsible for computing the effect
of bump mapping at the point being shaded given a particular displacement
`Texture`. So that future `Material` implementations aren’t
required to support bump mapping with this particular mechanism (or at
all), we’ve placed this method outside of the hard-coded material
evaluation pipeline and left it as a function that particular material
implementations can call on their own.

The implementation of `Material::Bump()` is based on finding an
approximation to the partial derivatives $\partial p'/\partial u$ and
$\partial p'/\partial v$ of the displaced
surface and using them in place of the surface’s actual partial derivatives to
compute the shading normal. (Recall that the surface normal is given by the
cross product of these vectors, $\mathbf{n} = \partial p/\partial u \times \partial p/\partial v$.) Assume that the
original surface is defined by a parametric function $p(u, v)$, and the bump
offset function is a scalar function $d(u, v)$. Then the displaced surface is
given by

$$p'(u, v) = p(u, v) + d(u, v)\,\mathbf{n}(u, v),$$

where $\mathbf{n}(u, v)$ is the surface normal at $(u, v)$.

The partial derivatives of this function can be found using the product rule. For example, the partial derivative in $u$ is

$$\frac{\partial p'}{\partial u} = \frac{\partial p}{\partial u} + \frac{\partial d}{\partial u}\,\mathbf{n} + d(u, v)\,\frac{\partial \mathbf{n}}{\partial u}.$$

We have already computed the value of $\partial p/\partial u$; it
is available in the `SurfaceInteraction` structure, which
also stores the surface normal $\mathbf{n}(u, v)$ and the partial derivative
$\partial \mathbf{n}/\partial u$. The displacement function $d(u, v)$
can be evaluated as needed, which leaves $\partial d/\partial u$ as the only remaining term.

There are two possible approaches to finding the values of
$\partial d/\partial u$ and $\partial d/\partial v$. One option
would be to augment the `Texture` interface with a method to compute
partial derivatives of the underlying texture function. For example, for
image map textures mapped to the surface directly using its
$(u, v)$ parameterization, these partial derivatives can be computed by subtracting
adjacent texels in the $u$ and $v$ directions. However, this approach is
difficult to extend to complex procedural textures like some of the ones
defined in Chapter 10. Therefore, `pbrt` directly computes
these values with forward differencing in the `Material::Bump()` method,
without modifying the `Texture` interface.

Recall the definition of the partial derivative:

$$\frac{\partial d}{\partial u} = \lim_{\Delta u \to 0} \frac{d(u + \Delta u, v) - d(u, v)}{\Delta u}.$$

Forward differencing approximates this value by using a finite $\Delta u$ and evaluating $d$ at the two positions. Thus, the final expression for $\partial p'/\partial u$ is the following (for simplicity, we have dropped the explicit dependence on $(u, v)$ for some of the terms):

$$\frac{\partial p'}{\partial u} \approx \frac{\partial p}{\partial u} + \frac{d(u + \Delta u, v) - d(u, v)}{\Delta u}\,\mathbf{n} + d(u, v)\,\frac{\partial \mathbf{n}}{\partial u}. \qquad (9.1)$$
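The forward-differencing step can be sketched in isolation (a sketch with a plain callable `d(u, v)`; pbrt instead evaluates its displacement `Texture` at a shifted copy of the `SurfaceInteraction`):

```cpp
#include <cassert>
#include <cmath>

// Forward differencing: approximate the partial derivative of d with
// respect to u by evaluating d at two nearby parametric positions.
template <typename D>
float ForwardDifferenceU(D d, float u, float v, float deltaU) {
    return (d(u + deltaU, v) - d(u, v)) / deltaU;
}
```

For example, with $d(u, v) = \sin u$, the approximation at $u = 0$ is close to the exact derivative $\cos 0 = 1$.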

Interestingly enough, most bump-mapping implementations ignore the final
term under the assumption that $d(u, v)$ is expected to be relatively small.
(Since bump mapping is mostly useful for approximating small perturbations,
this is a reasonable assumption.) The fact that many renderers do not
compute the values $\partial \mathbf{n}/\partial u$ and $\partial \mathbf{n}/\partial v$ may also have something to do with
this simplification. An implication of ignoring the last term is that the
magnitude of the displacement function then does not affect the bump-mapped
partial derivatives; adding a constant value to it globally doesn’t affect
the final result, since only differences of the bump function affect it.
`pbrt` computes all three terms since it has $\partial \mathbf{n}/\partial u$ and $\partial \mathbf{n}/\partial v$ readily
available, although in practice this final term rarely makes a visually
noticeable difference.
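That invariance is easy to verify: the differenced term in Equation (9.1) depends only on differences of the displacement function, so adding a constant offset to $d$ leaves it unchanged. A small sketch (the function name is hypothetical):

```cpp
#include <cassert>
#include <cmath>

// The differenced term of Equation (9.1). Because only the difference of
// two displacement values appears, adding a constant to the displacement
// function does not change this term, which is why, once the d(u,v) dn/du
// term is dropped, the absolute magnitude of d no longer matters.
float DifferencedTerm(float dAtU, float dAtUPlusDelta, float deltaU) {
    return (dAtUPlusDelta - dAtU) / deltaU;
}
```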

One important detail in the definition of `Bump()` is that the
`d` parameter is declared to be of type `const
shared_ptr<Texture<Float>> &`, rather than, for example,
`shared_ptr<Texture<Float>>`. This difference is very important for
performance, but the reason is subtle. If a C++ reference were not used
here, then the `shared_ptr` implementation would need to increment
the reference count for the temporary value passed to the method, and the
reference count would need to be decremented when the method returned.
This is an efficient operation with serial code, but with multiple threads
of execution, it leads to a situation where multiple processing cores end
up modifying the same memory location whenever different rendering tasks
run this method with the same displacement texture. This state of affairs
in turn leads to the expensive “read for ownership” operation described
in Section A.6.1.
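The effect can be seen with a small sketch: passing a `shared_ptr` by value bumps the reference count (an atomic operation that contending cores must serialize on), while a `const` reference leaves it untouched. The function names here are illustrative, not pbrt's:

```cpp
#include <cassert>
#include <memory>

// Passing by const reference: no reference-count traffic at all.
long CountViaConstRef(const std::shared_ptr<int> &p) { return p.use_count(); }

// Passing by value: the copy increments the (atomic) reference count on
// entry and decrements it again on return.
long CountViaValue(std::shared_ptr<int> p) { return p.use_count(); }
```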

In the implementation, a copy of the `SurfaceInteraction` named `siEval`
is used to evaluate the displacement texture at two auxiliary points:
`siEval` is first shifted slightly in the $u$ direction (the
<<Shift `siEval` `du` in the $u$ direction>> fragment) and the texture is
evaluated there, and the process is then repeated with a shift in the $v$
direction (<<Shift `siEval` `dv` in the $v$ direction>>). The texture is
also evaluated at the original, unshifted point.
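The shape of that evaluation sequence can be sketched as follows (a simplified stand-in: this `Interaction` holds only the $(u, v)$ coordinates and `d` is a plain callable, unlike pbrt's `SurfaceInteraction` and `Texture`):

```cpp
#include <cassert>
#include <cmath>

// Simplified stand-in for SurfaceInteraction: just (u, v) coordinates.
struct Interaction {
    float u, v;
};

// Evaluate a displacement function at the original point and at points
// shifted by du and dv, mirroring how Bump() reuses a copied interaction.
template <typename D>
void EvalDisplacements(D d, const Interaction &si, float du, float dv,
                       float *displace, float *uDisplace, float *vDisplace) {
    Interaction siEval = si;    // copy so the original si is untouched
    siEval.u = si.u + du;       // shift in the u direction
    *uDisplace = d(siEval.u, siEval.v);
    siEval = si;
    siEval.v = si.v + dv;       // shift in the v direction
    *vDisplace = d(siEval.u, siEval.v);
    *displace = d(si.u, si.v);  // displacement at the original point
}
```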

One remaining issue is how to choose the offsets $\Delta u$ and $\Delta v$
for the finite differencing computations. They should be small enough that
fine changes in $d(u, v)$ are captured but large enough so that the available
floating-point precision is sufficient to give a good result. Here, we will
choose $\Delta u$ and $\Delta v$ values that lead to an offset that is
about half the image space pixel sample spacing and use them to update the
appropriate member variables in the `SurfaceInteraction` to reflect a
shift to the offset position. (See Section 10.1.1 for an explanation of how
the image space distances are computed.)

Another detail to note in the following code: we recompute the surface
normal as the cross product of $\partial p/\partial u$ and $\partial p/\partial v$ rather than using
`si->shading.n` directly. The reason for this is that the orientation
of the normal may have been flipped (recall the fragment <<Adjust
normal based on orientation and handedness>> in
Section 2.10.1). However, we need the original
normal here. Later, when the results of the computation are passed to
`SurfaceInteraction::SetShadingGeometry()`, the normal we compute
will itself be flipped if necessary.

The <<Shift `siEval` `du` in the $u$ direction>> fragment shifts the copied
interaction by the chosen $\Delta u$ offset. The <<Shift `siEval` `dv` in
the $v$ direction>> fragment is nearly the same as the fragment that shifts
`du`, so it isn’t included here.
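The offset selection at the heart of that shift can be sketched like this, following the half-sample-spacing heuristic described above (the fallback constant for when no screen-space derivatives are available is an assumption of this sketch, not something stated in the text):

```cpp
#include <cassert>
#include <cmath>

// Choose the u offset as roughly half the image space sample spacing,
// given the screen-space derivatives dudx and dudy of the u coordinate.
// The small fallback handles the case where no derivatives are available
// (e.g., rays without differentials); its value is an assumption here.
float ChooseDu(float dudx, float dudy) {
    float du = 0.5f * (std::fabs(dudx) + std::fabs(dudy));
    if (du == 0) du = 0.0005f;
    return du;
}
```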

Given the new positions and the displacement texture’s values at them, the partial derivatives can be computed directly using Equation (9.1):
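As a sketch of that computation (hypothetical `Vec3` helpers; pbrt's version operates on the `SurfaceInteraction`'s shading geometry and then calls `SetShadingGeometry()`):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 Add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 Scale(float s, Vec3 v) { return {s * v.x, s * v.y, s * v.z}; }

// Equation (9.1): dp'/du ~= dp/du + ((d(u+du,v) - d(u,v)) / du) n + d dn/du,
// where displace = d(u, v) and uDisplace = d(u + du, v).
Vec3 BumpedDpdu(Vec3 dpdu, Vec3 n, Vec3 dndu,
                float displace, float uDisplace, float du) {
    return Add(Add(dpdu, Scale((uDisplace - displace) / du, n)),
               Scale(displace, dndu));
}
```

For a plane $p(u, v) = (u, v, 0)$ with constant normal $(0, 0, 1)$ and displacement $d(u, v) = u$, the bump-mapped $\partial p'/\partial u$ comes out close to $(1, 0, 1)$, the exact tangent of the displaced surface $(u, v, u)$.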