10.2 Texture Coordinate Generation

Almost all of the textures in this chapter are functions that take a 2D or 3D coordinate and return a texture value. Sometimes there are obvious ways to choose these texture coordinates; for parametric surfaces, such as the quadrics in Chapter 3, there is a natural 2D (u, v) parameterization of the surface, and for all surfaces the shading point p is a natural choice for a 3D coordinate.

In other cases, there is no natural parameterization, or the natural parameterization may be undesirable. For instance, the (u, v) values near the poles of spheres are severely distorted. Also, for an arbitrary subdivision surface, there is no simple, general-purpose way to assign texture values so that the entire [0, 1]² space is covered continuously and without distortion. In fact, creating smooth parameterizations of complex meshes with low distortion is an active area of research in computer graphics.

This section starts by introducing two abstract base classes—TextureMapping2D and TextureMapping3D—that provide an interface for computing these 2D and 3D texture coordinates. We will then implement a number of standard mappings using this interface (Figure 10.7 shows a number of them).

Figure 10.7: A checkerboard texture applied to a hyperboloid with different texture coordinate generation techniques. From left to right: (u, v) mapping, spherical mapping, cylindrical mapping, and planar mapping.

Texture implementations store a pointer to a 2D or 3D mapping function as appropriate and use it to compute the texture coordinates at each point. Thus, it's easy to add new mappings to the system without having to modify all of the Texture implementations, and different mappings can be used for different textures associated with the same surface. In pbrt, we will use the convention that 2D texture coordinates are denoted by (s, t); this helps make clear the distinction between the intrinsic (u, v) parameterization of the underlying surface and the (possibly different) coordinate values used for texturing.

The TextureMapping2D base class has a single method, TextureMapping2D::Map(), which is given the SurfaceInteraction at the shading point and returns the (s, t) texture coordinates via a Point2f. It also returns estimates for the change in (s, t) with respect to pixel x and y coordinates in the dstdx and dstdy parameters so that textures that use the mapping can determine the (s, t) sampling rate and filter accordingly.

<<Texture Declarations>>= 
class TextureMapping2D {
public:
    virtual ~TextureMapping2D() { }
    <<TextureMapping2D Interface>>
};

<<TextureMapping2D Interface>>= 
virtual Point2f Map(const SurfaceInteraction &si,
                    Vector2f *dstdx, Vector2f *dstdy) const = 0;

10.2.1 2D (u, v) Mapping

The simplest mapping uses the (u, v) coordinates in the SurfaceInteraction to compute the texture coordinates. Their values can be offset and scaled with user-supplied values in each dimension.

<<Texture Declarations>>+=  
class UVMapping2D : public TextureMapping2D {
public:
    <<UVMapping2D Public Methods>>
    UVMapping2D(Float su = 1, Float sv = 1, Float du = 0, Float dv = 0);
    Point2f Map(const SurfaceInteraction &si,
                Vector2f *dstdx, Vector2f *dstdy) const;
private:
    const Float su, sv, du, dv;
};

<<Texture Method Definitions>>= 
UVMapping2D::UVMapping2D(Float su, Float sv, Float du, Float dv)
    : su(su), sv(sv), du(du), dv(dv) { }

The scale-and-shift computation to compute (s, t) coordinates is straightforward:

<<Texture Method Definitions>>+=  
Point2f UVMapping2D::Map(const SurfaceInteraction &si,
                         Vector2f *dstdx, Vector2f *dstdy) const {
    <<Compute texture differentials for 2D (u, v) mapping>>
    return Point2f(su * si.uv[0] + du, sv * si.uv[1] + dv);
}

Computing the differential change in s and t in terms of the original change in u and v and the scale amounts is also easy. Using the chain rule,

    ∂s/∂x = (∂u/∂x)(∂s/∂u) + (∂v/∂x)(∂s/∂v)

and similarly for the three other partial derivatives. From the mapping method,

    s = s_u u + d_u,

so

    ∂s/∂u = s_u,    ∂s/∂v = 0,

and thus

    ∂s/∂x = s_u ∂u/∂x,

and so forth.

<<Compute texture differentials for 2D (u, v) mapping>>= 
*dstdx = Vector2f(su * si.dudx, sv * si.dvdx);
*dstdy = Vector2f(su * si.dudy, sv * si.dvdy);
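As a concrete check of the chain-rule result, here is a minimal self-contained sketch of the scale-and-shift mapping and its differentials, using plain floats in place of pbrt's types; Vec2, uvMap, and uvMapDifferential are hypothetical names, not part of pbrt.

```cpp
#include <cassert>

// Hypothetical 2D pair standing in for pbrt's Point2f/Vector2f.
struct Vec2 { float x, y; };

// Scale-and-shift mapping: s = su*u + du, t = sv*v + dv.
Vec2 uvMap(float su, float sv, float du, float dv, float u, float v) {
    return {su * u + du, sv * v + dv};
}

// Chain rule: ds/dx = su * du/dx and dt/dx = sv * dv/dx (likewise for y).
Vec2 uvMapDifferential(float su, float sv, float dudx, float dvdx) {
    return {su * dudx, sv * dvdx};
}
```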

10.2.2 Spherical Mapping

Another useful mapping effectively wraps a sphere around the object. Each point is projected along the vector from the sphere's center through the point, up to the sphere's surface. There, the (u, v) mapping for the sphere shape is used. The SphericalMapping2D stores a transformation that is applied to points before this mapping is performed; this effectively allows the mapping sphere to be arbitrarily positioned and oriented with respect to the object.

<<Texture Declarations>>+=  
class SphericalMapping2D : public TextureMapping2D {
public:
    <<SphericalMapping2D Public Methods>>
    SphericalMapping2D(const Transform &WorldToTexture)
        : WorldToTexture(WorldToTexture) { }
    Point2f Map(const SurfaceInteraction &si,
                Vector2f *dstdx, Vector2f *dstdy) const;
private:
    Point2f sphere(const Point3f &p) const;
    const Transform WorldToTexture;
};

<<Texture Method Definitions>>+=  
Point2f SphericalMapping2D::Map(const SurfaceInteraction &si,
                                Vector2f *dstdx, Vector2f *dstdy) const {
    Point2f st = sphere(si.p);
    <<Compute texture coordinate differentials for sphere (u, v) mapping>>
    <<Handle sphere mapping discontinuity for coordinate differentials>>
    return st;
}

A short utility function computes the mapping for a single point. It will be useful to have this logic separated out for computing texture coordinate differentials.

<<Texture Method Definitions>>+=  
Point2f SphericalMapping2D::sphere(const Point3f &p) const {
    Vector3f vec = Normalize(WorldToTexture(p) - Point3f(0, 0, 0));
    Float theta = SphericalTheta(vec), phi = SphericalPhi(vec);
    return Point2f(theta * InvPi, phi * Inv2Pi);
}
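A standalone sketch of the same spherical projection may be helpful, with the world-to-texture transform assumed to be the identity and the spherical-angle computations written out inline; Pt2 and sphereMap are hypothetical names, not pbrt's.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical stand-in for Point2f.
struct Pt2 { float s, t; };

// Map a point (already in the mapping sphere's coordinate system) to (s, t):
// s = theta / pi, t = phi / (2 pi), mirroring SphericalMapping2D::sphere().
Pt2 sphereMap(float x, float y, float z) {
    const float Pi = 3.14159265358979f;
    // Normalize the direction from the sphere's center through the point.
    float len = std::sqrt(x * x + y * y + z * z);
    x /= len; y /= len; z /= len;
    float theta = std::acos(z);    // angle from the +z axis, in [0, pi]
    float phi = std::atan2(y, x);  // angle around the z axis...
    if (phi < 0) phi += 2 * Pi;    // ...remapped to [0, 2 pi)
    return {theta / Pi, phi / (2 * Pi)};
}
```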

We could use the chain rule again to compute the texture coordinate differentials but will instead use a forward differencing approximation to demonstrate another way to compute these values that is useful for more complex mapping functions. Recall that the SurfaceInteraction stores the screen space partial derivatives ∂p/∂x and ∂p/∂y that give the change in position as a function of change in image sample position. Therefore, if the s coordinate is computed by some function f_s(p), it's easy to compute approximations like

    ∂s/∂x ≈ (f_s(p + Δ ∂p/∂x) − f_s(p)) / Δ.

As the distance Δ approaches 0, this gives the actual partial derivative at p.

<<Compute texture coordinate differentials for sphere (u, v) mapping>>= 
const Float delta = .1f;
Point2f stDeltaX = sphere(si.p + delta * si.dpdx);
*dstdx = (stDeltaX - st) / delta;
Point2f stDeltaY = sphere(si.p + delta * si.dpdy);
*dstdy = (stDeltaY - st) / delta;
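The forward-differencing idea is independent of the particular mapping function; the following generic sketch reduces pbrt's 3D point and derivative to scalars for clarity (forwardDifference is a hypothetical helper, not part of pbrt).

```cpp
#include <cassert>
#include <cmath>

// Approximate df/dx at p by stepping a distance delta along dp/dx:
// df/dx ~= (f(p + delta * dp/dx) - f(p)) / delta.
template <typename F>
float forwardDifference(F f, float p, float dpdx, float delta = 0.1f) {
    return (f(p + delta * dpdx) - f(p)) / delta;
}
```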

One other detail is that the sphere mapping has a discontinuity in the mapping formula; there is a seam at t = 1, where the t texture coordinate discontinuously jumps back to zero. We can detect this case by checking to see if the absolute value of the estimated derivative computed with forward differencing is greater than 0.5 and then adjusting it appropriately.

<<Handle sphere mapping discontinuity for coordinate differentials>>= 
if ((*dstdx)[1] > .5f)       (*dstdx)[1] = 1 - (*dstdx)[1];
else if ((*dstdx)[1] < -.5f) (*dstdx)[1] = -((*dstdx)[1] + 1);
if ((*dstdy)[1] > .5f)       (*dstdy)[1] = 1 - (*dstdy)[1];
else if ((*dstdy)[1] < -.5f) (*dstdy)[1] = -((*dstdy)[1] + 1);
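The seam correction can be isolated as a small helper; wrapSeam is an illustrative name, and the logic mirrors the fragment above: an estimated t derivative with magnitude above 0.5 indicates that the forward-differencing step crossed the t = 1 → 0 seam, so it is remapped to an equivalently small value.

```cpp
#include <cassert>
#include <cmath>

// Remap a t-derivative estimate that crossed the seam, matching pbrt's
// adjustment: values near +1 or -1 become small values again.
float wrapSeam(float dtdx) {
    if (dtdx > 0.5f) return 1 - dtdx;
    if (dtdx < -0.5f) return -(dtdx + 1);
    return dtdx;
}
```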

10.2.3 Cylindrical Mapping

The cylindrical mapping effectively wraps a cylinder around the object. It also supports a transformation to orient the mapping cylinder.

<<Texture Declarations>>+=  
class CylindricalMapping2D : public TextureMapping2D {
public:
    <<CylindricalMapping2D Public Methods>>
    CylindricalMapping2D(const Transform &WorldToTexture)
        : WorldToTexture(WorldToTexture) { }
    Point2f Map(const SurfaceInteraction &si,
                Vector2f *dstdx, Vector2f *dstdy) const;
private:
    <<CylindricalMapping2D Private Methods>>
    const Transform WorldToTexture;
};

The cylindrical mapping has the same basic structure as the sphere mapping; just the mapping function is different. Therefore, we will omit the fragment that computes texture coordinate differentials, since it is essentially the same as the spherical version.

<<Texture Method Definitions>>+=  
Point2f CylindricalMapping2D::Map(const SurfaceInteraction &si,
                                  Vector2f *dstdx, Vector2f *dstdy) const {
    Point2f st = cylinder(si.p);
    <<Compute texture coordinate differentials for cylinder (u, v) mapping>>
    return st;
}

<<CylindricalMapping2D Private Methods>>= 
Point2f cylinder(const Point3f &p) const {
    Vector3f vec = Normalize(WorldToTexture(p) - Point3f(0, 0, 0));
    return Point2f((Pi + std::atan2(vec.y, vec.x)) * Inv2Pi, vec.z);
}

10.2.4 Planar Mapping

Another classic mapping method is planar mapping. The point is effectively projected onto a plane; a 2D parameterization of the plane then gives texture coordinates for the point. For example, a point p might be projected onto the z = 0 plane to yield texture coordinates given by s = p_x and t = p_y.

In general, we can define such a parameterized plane with two nonparallel vectors v_s and v_t and offsets d_s and d_t. The texture coordinates are given by the coordinates of the point with respect to the plane's coordinate system, which are computed by taking the dot product of the vector from the origin to the point with each vector v_s and v_t and then adding the corresponding offset. For the example in the previous paragraph, we'd have v_s = (1, 0, 0), v_t = (0, 1, 0), and d_s = d_t = 0.

<<Texture Declarations>>+=  
class PlanarMapping2D : public TextureMapping2D {
public:
    <<PlanarMapping2D Public Methods>>
    Point2f Map(const SurfaceInteraction &si,
                Vector2f *dstdx, Vector2f *dstdy) const;
private:
    const Vector3f vs, vt;
    const Float ds, dt;
};

<<PlanarMapping2D Public Methods>>= 
PlanarMapping2D(const Vector3f &vs, const Vector3f &vt,
                Float ds = 0, Float dt = 0)
    : vs(vs), vt(vt), ds(ds), dt(dt) { }

The planar mapping differentials can be computed directly by finding the differentials of the point p in texture coordinate space.

<<Texture Method Definitions>>+=  
Point2f PlanarMapping2D::Map(const SurfaceInteraction &si,
                             Vector2f *dstdx, Vector2f *dstdy) const {
    Vector3f vec(si.p);
    *dstdx = Vector2f(Dot(si.dpdx, vs), Dot(si.dpdx, vt));
    *dstdy = Vector2f(Dot(si.dpdy, vs), Dot(si.dpdy, vt));
    return Point2f(ds + Dot(vec, vs), dt + Dot(vec, vt));
}
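A self-contained sketch of the planar projection, with a plain Vec3 standing in for pbrt's Vector3f; all names here are illustrative, not pbrt's.

```cpp
#include <cassert>

// Minimal 3D vector stand-in.
struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct ST { float s, t; };

// Texture coordinates are the point's coordinates with respect to the plane:
// s = ds + (p . vs), t = dt + (p . vt). Applying the same dot products to
// dp/dx and dp/dy gives the texture differentials.
ST planarMap(Vec3 p, Vec3 vs, Vec3 vt, float ds, float dt) {
    return {ds + dot(p, vs), dt + dot(p, vt)};
}
```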

10.2.5 3D Mapping

We will also define a TextureMapping3D class that provides the interface for generating 3D texture coordinates.

<<Texture Declarations>>+=  
class TextureMapping3D {
public:
    virtual ~TextureMapping3D() { }
    <<TextureMapping3D Interface>>
};

<<TextureMapping3D Interface>>= 
virtual Point3f Map(const SurfaceInteraction &si,
                    Vector3f *dpdx, Vector3f *dpdy) const = 0;

The natural 3D mapping just takes the world space coordinate of the point and applies a linear transformation to it. This will often be a transformation that takes the point back to the primitive’s object space.

<<Texture Declarations>>+=  
class TransformMapping3D : public TextureMapping3D {
public:
    <<TransformMapping3D Public Methods>>
private:
    const Transform WorldToTexture;
};

Because a linear mapping is used, the differential change in texture coordinates can be found by applying the same mapping to the partial derivatives of position.

<<Texture Method Definitions>>+=  
Point3f TransformMapping3D::Map(const SurfaceInteraction &si,
                                Vector3f *dpdx, Vector3f *dpdy) const {
    *dpdx = WorldToTexture(si.dpdx);
    *dpdy = WorldToTexture(si.dpdy);
    return WorldToTexture(si.p);
}
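Because the mapping is linear, the same map applies to the point and to its screen-space derivatives; a small sketch with a nonuniform scale standing in for a general Transform (V3 and mapScale are hypothetical names):

```cpp
#include <cassert>

struct V3 { float x, y, z; };

// Apply a diagonal linear map (a nonuniform scale). For any linear map M,
// the texture-space differentials are M(dp/dx) and M(dp/dy), computed
// exactly as the texture-space point M(p) is.
V3 mapScale(V3 scale, V3 v) {
    return {scale.x * v.x, scale.y * v.y, scale.z * v.z};
}
```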