10.11. Tone Mapping 479
Figure 10.25. The effect of varying light level [881]. The brighter images also have a
post-process bloom effect applied to them. (Images from “Half-Life 2: Lost Coast”
courtesy of Valve Corp.)
needed to compute a good approximation of the current frame’s average
luminance.
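The log-average luminance referred to here is commonly computed as the exponentiated mean of the log of the pixel luminances (i.e., their geometric mean), with a small bias delta to avoid taking the log of zero for black pixels. A minimal Python sketch under that assumption, taking the frame's luminances as a flat list:

```python
import math

def log_average_luminance(luminances, delta=1e-4):
    """Log-average (geometric mean) luminance of a frame.

    The small bias delta avoids log(0) on pure black pixels.
    """
    n = len(luminances)
    return math.exp(sum(math.log(delta + lw) for lw in luminances) / n)
```

In practice this reduction is done on the GPU, e.g., by repeated downsampling of a log-luminance texture; the scalar form above just shows the math.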
Once this log-average luminance, \bar{L}_w, is computed, a variety of tone operators
are possible. For example, one simple mapping is

    L(x, y) = \frac{a}{\bar{L}_w} L_w(x, y),    (10.10)
where L(x, y) is the resulting luminance. The a parameter is the key of
the scene. High-key in photography means that contrast and shadows are
minimized and all is brightly (but not overly) lit; low-key tends to maximize
contrast between light and dark. A bride in a white dress against a white
background is high-key, a wolf backlit by the moon is low-key. A normal
key value is around a = 0.18, high-key up to 0.72, low-key down to 0.045.
Within the area of tone mapping, the key value is similar in effect to the
exposure level in a camera, and sometimes the two are used interchangeably.
True high-key and low-key photography are functions of the type of lighting
in a scene, not the exposure level. An example of the effect of varying the
mapping is shown in Figure 10.25.
This equation tends to compress both the high and low luminance val-
ues. However, normal and low luminance values often hold more visual
detail, so an improved equation is
    L_d(x, y) = \frac{L(x, y)}{1 + L(x, y)}.    (10.11)
This adjustment compresses high luminance values to approach a value of
1, while low luminance values are nearly untouched.
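Equations 10.10 and 10.11 can be sketched as scalar Python functions (the per-pixel scalar interface and function names are illustrative; a real implementation runs over the whole frame on the GPU):

```python
def scale_luminance(lw, key, log_avg):
    """Eq. 10.10: scale world luminance lw by the key a divided by the
    log-average luminance of the frame."""
    return key / log_avg * lw

def compress(l):
    """Eq. 10.11: compress high luminances toward 1 while leaving low
    luminances nearly untouched."""
    return l / (1.0 + l)
```

For example, a pixel at exactly the frame's log-average luminance maps to the key value itself under Equation 10.10.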
480 10. Image-Based Effects
Some luminance values can be extremely bright in an image, e.g., the
sun. One solution is to choose a maximum luminance, L_{white}, and use the
variant equation

    L_d(x, y) = \frac{L(x, y) \left(1 + L(x, y)/L_{white}^2\right)}{1 + L(x, y)},    (10.12)
which blends between the previous equation and a linear mapping.
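Equation 10.12 in the same scalar sketch form (note that an input luminance of exactly L_white maps to 1; anything brighter maps above 1 and is clamped to white on display):

```python
def compress_white(l, l_white):
    """Eq. 10.12: blend between Eq. 10.11 and a linear mapping.
    Luminances at l_white map exactly to 1; brighter values burn out."""
    return l * (1.0 + l / (l_white * l_white)) / (1.0 + l)
```

As l_white grows very large, the extra term vanishes and this reduces to Equation 10.11.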
While the process of tone mapping sounds complex and time consum-
ing, the overall effect can cost as little as 1% of total rendering time [1174].
One caveat with non-linear tone mapping (i.e., using any function more
complex than rescaling) is that it can cause problems similar to those dis-
cussed in Section 5.8 on gamma correction. In particular, roping artifacts
along antialiased edges will be more noticeable, and mipmap generation of
textures will not take into account the tone mapping operator used. Pers-
son [1008] discusses how to correct this problem in DirectX 10 by tone
mapping the multisample buffer and then resolving it to the final image.
This section has covered some of the more popular global tone operators
used in interactive applications. Considerably more research has been done
in this field, for example in the area of local tone operators. See the
books by Reinhard et al. [1059] and Pharr and Humphreys [1010] for more
information about tone mapping.
10.11.1 High Dynamic Range Imaging and Lighting
Tone mapping is used at the end of the rendering process, when lighting
calculations are done. Lighting calculations themselves can suffer if the
precision of the light source data is not high enough. One problem with
using environment mapping to portray lighting is that the dynamic range
of the light captured is often limited to eight bits per color channel [231].
Directly visible light sources are usually hundreds to thousands of times
brighter than the indirect illumination (bounced light from walls, ceilings,
and other objects in the environment), so eight bits are not enough to
simultaneously capture the full range of incident illumination. For example,
suppose a highly specular object (say a brass candlestick, reflecting 70% of
the light) and a darker specular object (say an ebony statue, reflecting 5%
of the light) use the same environment map (EM). If the EM contains just
the lights in the room, the statue will look good and the candlestick will
not; if the EM contains all the objects in the room, the opposite is true.
This is because the statue is only shiny enough to visibly reflect direct light
sources and little else, while the candlestick will reflect a nearly mirror-like
image of the environment. Even if a single image is used and scaled in some
fashion, the lack of precision will often show up as color banding artifacts.
The solution is, at least in theory, simple enough: Use higher precision
environment maps. The acquisition of real-world environment data can be
done with specialized photographic methods and equipment. For example,
a camera might take three photos of the same scene at different exposures,
which are then combined into a single high dynamic range image. The
process of capturing such images is called high dynamic range imaging
(HDRI).
Tone mapping is intimately connected with HDRI, since an HDR image
is not displayable without mapping its values to the display in some fashion.
Tone-mapped images can have a rich visual quality, as detail can be exposed
in areas that might normally be too dark or overexposed. An example is
a photo taken inside a room, looking out a window. Overall indoor il-
lumination is normally considerably less than that found outside. Using
HDRI acquisition and tone mapping gives a resulting image that shows
both environments simultaneously. Reinhard et al. [1059] discuss cap-
ture and use of HDR images in computer graphics in their comprehensive
book.
Most image file formats store display pixel values ranging from 0 to 255.
When performing physically accurate simulations, results are computed in
floating point radiance values and then tone mapped and gamma-corrected
for display. Ward [1325] developed the RGBE format to store the computed
values, instead of those finally displayed. RGBE retains the floating-point
accuracy of results by using three 8-bit mantissas for RGB and an 8-bit
exponent for all three values. In this way, the original radiance computa-
tions can be stored at higher precision at little additional cost, just 32 bits
versus the usual 24. DirectX 10 adds a variant format, R9G9B9E5, which
offers greater color precision with a reduced exponent range (5 bits). This
shared-exponent format type is normally used only as an input (texture)
format, due to the difficulty in rapidly determining a good exponent that
works for all three channels. A DirectX 10 compact HDR format that is
supported for output is the R11G11B10_FLOAT format, where R and G
each have 6 bits of mantissa and B has 5, and each channel has a separate
5-bit exponent [123].
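Ward's RGBE packing can be illustrated with a small Python sketch (a simplified version of the usual encode/decode pair; the function names are ours, and a production codec would treat rounding and edge cases more carefully):

```python
import math

def float_to_rgbe(r, g, b):
    """Pack three floats into RGBE: three 8-bit mantissas sharing one
    8-bit exponent, biased by 128 and chosen from the largest channel."""
    v = max(r, g, b)
    if v < 1e-32:
        return (0, 0, 0, 0)
    e = math.frexp(v)[1]                 # v = m * 2**e, 0.5 <= m < 1
    scale = 256.0 / math.ldexp(1.0, e)   # maps [0, 2**e) onto [0, 256)
    return (int(r * scale), int(g * scale), int(b * scale), e + 128)

def rgbe_to_float(r, g, b, e):
    """Unpack RGBE back to floats; a zero exponent byte means black."""
    if e == 0:
        return (0.0, 0.0, 0.0)
    f = math.ldexp(1.0, e - 128 - 8)     # 2**(e-128) / 256
    return ((r + 0.5) * f, (g + 0.5) * f, (b + 0.5) * f)
```

Because the exponent comes from the brightest channel, much dimmer channels in the same pixel lose precision, which is the main limitation of shared-exponent formats.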
In 2003, Industrial Light and Magic released code for its OpenEXR
format [617, 968]. This is a high dynamic range file format that supports
a number of pixel depth formats, including a 16-bit half-precision floating
point format (per channel) that is supported by graphics accelerators.
The term HDR lighting is also loosely used to refer to a wider range
of effects beyond higher precision images. For example, the term is often
used to also mean the glow effect that is created using image post process-
ing. The section that follows describes a variety of phenomena and their
simulation that provide the viewer with visual cues that some part of the
scene is bright.
10.12 Lens Flare and Bloom
Lens flare is a phenomenon that is caused by the lens of the eye or camera
when directed at bright light. It consists of a halo and a ciliary corona.
The halo is caused by the radial fibers of the crystalline structure of the
lens. It looks like a ring around the light, with its outside edge tinged
with red, and its inside with violet. The halo has a constant apparent
size, regardless of the distance of the source. The ciliary corona comes
from density fluctuations in the lens, and appears as rays radiating from a
point, which may extend beyond the halo [1208].
Camera lenses can also create secondary effects when parts of the lens
reflect or refract light internally. For example, hexagonal patterns can
appear due to the lens’s aperture blades. Streaks of light can also be seen
to smear across a windshield, due to small grooves in the glass [951]. Bloom
is caused by scattering in the lens and other parts of the eye, creating a
glow around the light and dimming contrast elsewhere in the scene. In
video production, the camera captures an image by converting photons
to charge using a charge-coupled device (CCD). Bloom occurs in a video
camera when a charge site in the CCD gets saturated and overflows into
neighboring sites. As a class, halos, coronae, and bloom are called glare
effects.
In reality, most such artifacts are less and less commonly seen as camera
technology improves. However, these effects are now routinely added digi-
tally to real photos to denote brightness. Similarly, there are limits to the
light intensity produced by the computer monitor, so to give the impres-
sion of increased brightness in a scene or objects, these types of effects are
explicitly rendered. The bloom effect and lens flare are almost interactive
computer graphics clichés, due to their common use. Nonetheless, when
skillfully employed, such effects can give strong visual cues to the viewer;
for example, see Figure 8.16 on page 306.
Figure 10.26 shows a typical lens flare. It is produced by using a set of
textures for the glare effects. Each texture is applied to a square that is
made to face the viewer, so forming a billboard. The texture is treated as
an alpha map that determines how much of the square to blend into the
scene. Because it is the square itself that is being displayed, the square
can be given a color (typically a pure red, green, or blue) for prismatic
effects for the ciliary corona. Where they overlap, these sprites are blended
using an additive effect to get other colors. Furthermore, by animating the
ciliary corona, we create a sparkle effect [649].
To provide a convincing effect, the lens flare should change with the
position of the light source. King [662] creates a set of squares with different
textures to represent the lens flare. These are then oriented on a line going
from the light source position on screen through the screen’s center. When
Figure 10.26. A lens flare and its constituent textures. On the right, a halo and a bloom
are shown above, and two sparkle textures below. These textures are given color when
rendered. (Images from a Microsoft DirectX SDK program.)
the light is far from the center of the screen, the sprites are small and
more transparent, becoming larger and more opaque as the light moves
inwards. Maughan [826] varies the brightness of a lens flare by using only
the GPU to compute the occlusion of an onscreen area light source. He
generates a single-pixel intensity texture that is then used to attenuate the
brightness of the effect. Sekulic [1145] renders the light source as a single
polygon, using the occlusion query hardware to give a pixel count of the
area visible (see Section 14.6.1). To avoid GPU stalls from waiting for the
query to return a value to the CPU, the result is used in the next frame to
determine the amount of attenuation. This idea of gathering information
and computing a result for use in the next frame is important, and can be
used with most of the techniques discussed in this section.
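King's placement scheme can be sketched as follows, assuming the light position is given in normalized device coordinates with the screen center at the origin (the function names and the linear fade curve are illustrative choices, not taken from the cited work):

```python
import math

def flare_sprite_positions(light_xy, offsets):
    """Place flare sprites on the line from the light's screen position
    through the screen center (the origin in NDC). An offset of 1 lands
    at the light, 0 at the center, negative values on the far side."""
    lx, ly = light_xy
    return [(lx * t, ly * t) for t in offsets]

def flare_fade(light_xy):
    """Sprites are small and transparent when the light is far from the
    center, larger and more opaque as it moves inward; here a simple
    linear falloff with distance (the exact curve is a design choice)."""
    d = min(1.0, math.hypot(light_xy[0], light_xy[1]) / math.sqrt(2.0))
    return 1.0 - d
```

Each returned position would be used to place a billboarded square with one of the flare textures, scaled and blended by the fade factor.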
Streaks from bright objects or lights in a scene can be performed in a
similar fashion by either drawing semitransparent billboards or performing
post-processing filtering on the bright pixels themselves. Oat [951] discusses
using a steerable filter to produce the streak effect. Instead of filtering
symmetrically over an area, this type of filter is given a direction. Texel
values along this direction are summed together, which produces a streak
effect. Using an image downsampled to one quarter of the width and height,
and two passes using ping-pong buffers, gives a convincing streak effect.
Figure 10.27 shows an example of this technique.
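One pass of such a directional filter might look like the following CPU-side Python sketch over a 2D luminance grid (a real implementation would be a pixel shader on a downsampled buffer; the step vector, sample count, and attenuation are tunable assumptions):

```python
def streak_pass(img, step, samples=4, attenuation=0.9):
    """One gather pass of a directional streak filter. Each output texel
    sums `samples` texels stepping along `step`, weighted by
    attenuation**s, so bright spots smear opposite the step direction.
    Repeated passes with growing step sizes (ping-ponging between two
    buffers) lengthen the streak cheaply."""
    h, w = len(img), len(img[0])
    dx, dy = step
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = 0.0
            for s in range(samples):
                sx, sy = x + dx * s, y + dy * s
                if 0 <= sx < w and 0 <= sy < h:
                    total += (attenuation ** s) * img[sy][sx]
            out[y][x] = total
    return out
```

Running the filter once per streak direction (e.g., four diagonal passes) and adding the results to the frame gives the star-like pattern.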
The bloom effect, where an extremely bright area spills over into ad-
joining pixels, is performed by combining a number of techniques already
presented. The main idea is to create a bloom image consisting only of the
bright objects that are to be “overexposed,” blur this, then composite it