8.2 Path Tracing

We have seen how ray tracing can be extended to approximate a variety of effects that are not handled by the basic algorithm. We look next at an algorithm that accounts for all those effects and more in a fairly straightforward and unified way: path tracing. Like ray tracing, path tracing computes colors for points in an image by tracing the paths of light rays backwards from the viewer through points on the image and into the scene. But in path tracing, the idea is to account for all possible paths that the light could have followed. Of course, that is not literally possible, but following a large number of paths can give a good approximation—one that gets better as the number of paths is increased. ("Forward path tracing," where paths of light rays emitted from light sources are traced forward in time, is also sometimes used.)

8.2.1 BSDFs

In order to model a wide variety of physical phenomena, path tracing uses a generalization of the idea of material property. In OpenGL, a material is a combination of ambient, diffuse, specular, and emission colors, plus shininess. These properties, except for emission color, model how the surface interacts with light. Material properties can vary from point to point on a surface; that's an example of a texture.

OpenGL material is only a rough approximation of reality. In path tracing, a more general notion is used that is capable of more accurately representing the properties of almost any real physical surface or volume. The replacement for materials is called a BSDF, or Bidirectional Scattering Distribution Function.

Think about how light that arrives at some point can be affected by the physical properties of whatever substance exists at that point. Some of the light might be absorbed. Some might pass through the point without being affected at all. And some might be "scattered," that is, sent off in another direction. In fact, we consider passing through the point as a special case of scattering. A BSDF describes how light is scattered from each point on a surface or in a volume.

Think of a single ray, or photon, of light that arrives at some point. What happens to it can depend on the direction from which it arrives. In general, assuming that it is not absorbed, the light is more likely to be scattered in some directions than in others. (As in specular reflection, for example.) The BSDF at the point gives the probability that the ray will leave the point heading in a given direction. It is a "bidirectional" function because the answer is a function of two directions, the direction from which the light arrives and the outgoing direction that you are asking about. (It is a "distribution function" in the sense of the mathematical theory of continuous probability distributions, but you don't need to understand that to get the general idea. For us, it's enough to understand that the function says how light coming in from a given direction is distributed among possible outgoing directions.) Note that a BSDF is also a function of the point that you are talking about, and it is generally a function of the wavelength of the light as well.

Any point in space can be assigned a BSDF. For empty space, the BSDF is trivial: It simply says that light arriving at a point has a 100% probability of continuing in the same direction. But light passing through fog or dusty air or dirty water has some probability of being absorbed and some probability of being scattered to a random direction. Similar remarks apply to light passing through the interior of a translucent solid object.
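
To make the idea more concrete, here is a minimal sketch in Python of what a BSDF might look like in a path tracer. The class and method names are hypothetical, invented just for this illustration, and a full BSDF would also depend on the point and on the wavelength of the light. The two classes correspond to the trivial BSDF for empty space and to a simple fog-like BSDF that either scatters light to a completely random direction or lets it pass straight through.

    import math, random

    class EmptySpaceBSDF:
        """The trivial BSDF for empty space: light always continues
        in the same direction."""
        def scatter(self, incoming):
            return incoming   # 100% probability of continuing unchanged

    class FogBSDF:
        """A simple fog-like BSDF: with some probability the light is
        scattered to a completely random direction; otherwise it passes
        through unaffected.  (Absorption is ignored in this sketch.)"""
        def __init__(self, scatter_probability):
            self.scatter_probability = scatter_probability
        def scatter(self, incoming):
            if random.random() < self.scatter_probability:
                return random_unit_vector()   # scattered to a random direction
            else:
                return incoming               # passes through unaffected

    def random_unit_vector():
        """A random direction, distributed uniformly over the unit sphere."""
        while True:
            v = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
            length2 = v[0]*v[0] + v[1]*v[1] + v[2]*v[2]
            if 0 < length2 <= 1:
                length = math.sqrt(length2)
                return (v[0]/length, v[1]/length, v[2]/length)

    fog = FogBSDF(0.25)
    print(fog.scatter((0.0, 0.0, 1.0)))   # either (0.0, 0.0, 1.0) or a random direction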

Traditionally, though, computer graphics has been mostly concerned with what happens to light at the surface of an object. Light can be absorbed or reflected or, if the object is translucent, transmitted through the surface. The function that describes the reflection of light from a surface is sometimes called a BRDF (Bidirectional Reflectance Distribution Function), and the formula for transmission of light is a BTDF (Bidirectional Transmission Distribution Function). The BSDF for a surface is a combination of the two.

Let's consider OpenGL materials in terms of BSDFs. In basic OpenGL, light can only be reflected or absorbed. For diffuse reflection, light has an equal probability of being reflected in every direction that makes an angle of less than 90 degrees with the normal vector to the surface, and there is no dependence on the direction from which the light arrives. For specular reflection, the incoming light direction matters. In OpenGL, the possible outgoing directions for specularly reflected light form a cone, where the angle between the axis of the cone and the normal vector is equal to the angle between the normal vector and the incoming light direction. The axis of the cone is the most likely direction for outgoing light, and the probability falls off as the angle between the outgoing direction and the direction of the axis increases. The rate of falloff is specified by the shininess property of the material. The BRDF for the surface combines the diffuse and specular reflection. (The ambient material property doesn't fit well into the BSDF framework, since physically there is no such thing as an "ambient light" that is somehow different from regular light.)
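
As a sketch of how this OpenGL-style material could be treated as a BSDF, the hypothetical Python function below chooses a random outgoing direction for reflected light. The split into a diffuse probability and a specular probability, and the use of rejection sampling for the shiny lobe, are simplifications made up for this illustration rather than the behavior of any particular renderer; the incoming parameter is the unit direction in which the light is traveling when it hits the surface.

    import math, random

    def dot(a, b):
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

    def reflect(d, n):
        """Mirror reflection of unit direction d about unit normal n."""
        k = 2 * dot(d, n)
        return (d[0] - k*n[0], d[1] - k*n[1], d[2] - k*n[2])

    def random_unit_vector():
        """A random direction, distributed uniformly over the unit sphere."""
        while True:
            v = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
            length2 = dot(v, v)
            if 0 < length2 <= 1:
                length = math.sqrt(length2)
                return (v[0]/length, v[1]/length, v[2]/length)

    def sample_material(incoming, normal, diffuse_weight, shininess):
        """Choose an outgoing direction for light arriving along 'incoming' at a
        surface with unit normal 'normal'.  With probability diffuse_weight the
        reflection is diffuse; otherwise it is specular, concentrated in a cone
        around the mirror direction, with falloff controlled by shininess."""
        if random.random() < diffuse_weight:
            # Diffuse: equal probability for every direction within 90 degrees of the normal.
            d = random_unit_vector()
            return d if dot(d, normal) > 0 else (-d[0], -d[1], -d[2])
        else:
            # Specular: the most likely direction is the mirror direction; a direction
            # at angle a from that axis is accepted with probability cos(a)^shininess.
            axis = reflect(incoming, normal)
            while True:
                d = random_unit_vector()
                c = dot(d, axis)
                if c > 0 and random.random() < c ** shininess:
                    return d

    print(sample_material((0.6, 0.0, -0.8), (0.0, 0.0, 1.0), 0.7, 32))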

Ray tracing adds two new possibilities to the interaction of light with a surface: perfect, mirror-like reflection, where the outgoing light makes exactly the same angle with the normal vector as the incoming light, and transmission of light into a translucent object, where the outgoing angle is determined by the indices of refraction outside and inside the object.
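
Both of these directions can be computed directly from standard formulas. The sketch below, again with hypothetical Python function names, computes the mirror direction and the refracted direction given by Snell's law; ratio is the index of refraction on the side the light comes from divided by the index on the other side, and the normal is assumed to point back toward the incoming light. A return value of None signals total internal reflection, in which case there is no refracted ray.

    import math

    def dot(a, b):
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

    def reflect(d, n):
        """Mirror reflection of unit direction d about unit normal n."""
        k = 2 * dot(d, n)
        return (d[0] - k*n[0], d[1] - k*n[1], d[2] - k*n[2])

    def refract(d, n, ratio):
        """Refracted direction for unit direction d crossing a surface with unit
        normal n (pointing toward the side the light comes from), where ratio is
        the index of refraction on the incoming side divided by the index on the
        outgoing side.  Returns None in the case of total internal reflection."""
        cos_in = -dot(d, n)                           # cosine of the angle of incidence
        sin2_out = ratio*ratio * (1 - cos_in*cos_in)  # Snell's law, squared
        if sin2_out > 1:
            return None                               # total internal reflection
        k = ratio*cos_in - math.sqrt(1 - sin2_out)
        return (ratio*d[0] + k*n[0], ratio*d[1] + k*n[1], ratio*d[2] + k*n[2])

    print(reflect((0.6, 0.0, -0.8), (0.0, 0.0, 1.0)))        # (0.6, 0.0, 0.8)
    print(refract((0.6, 0.0, -0.8), (0.0, 0.0, 1.0), 1/1.5)) # bends toward the normal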

But BSDFs can provide even more realistic models of the interaction of light with surfaces. For example, the distinction between mirror-like reflection of an object and specular reflection of a light source is artificial. A perfect mirror should reflect both light sources and objects in a mirror-like way. For a shiny but rough surface, all specular reflection would send the light in a cone of directions, giving fuzzy images of objects and lights alike. A BSDF should handle both cases, and it shouldn't distinguish between light from light sources and light reflected off other objects.

BSDFs can also correctly handle a phenomenon called subsurface scattering, which can be an important visual effect for materials that are just a bit translucent, such as milk, jade, and skin. In subsurface scattering, light that hits a surface can be transmitted into the object, be scattered a few times internally inside the object, and then emerge from the surface at another point. How the light behaves inside the object is determined by the BSDF of the material in the interior of the object. The BSDF in this case would be similar to the one for fog, except that the probability of scattering would be larger.

The point is that just about any physically realistic material can be modeled by a correctly chosen BSDF.

8.2.2 The Path Tracing Algorithm

Path tracing is based on a formula known as the "rendering equation." The formula says that the amount of light energy leaving a given point in a given direction is equal to the amount of light energy emitted by the point in that direction plus the amount of light energy arriving at the point from other sources that is then scattered in that direction.
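
In symbols, one common way of writing the rendering equation for a point on a surface is

    L_o(p, \omega_o) = L_e(p, \omega_o) + \int_{\Omega} f(p, \omega_i, \omega_o) \, L_i(p, \omega_i) \, (\omega_i \cdot n) \, d\omega_i

where L_o is the light leaving the point p in direction \omega_o, L_e is the light emitted at p in that direction, f is the BSDF at p, L_i is the light arriving at p from direction \omega_i, n is the unit normal vector, and the integral is taken over the hemisphere of possible incoming directions. The notation differs from one source to another; the formula is shown here only to make the verbal statement above concrete.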

Here, emitted light means light that is created, as by a light source. In the rendering equation, any object can be an emitter of light. In OpenGL terms, it's as if an object with an emission color actually emits light that can illuminate other objects. An area light is just an extended object that emits light from every point, and it is common to illuminate scenes with large light-emitting objects. (In fact, in a typical path tracing setup, point lights and directional lights have to be assigned some area to make them work correctly in the algorithm, or some forward path tracing has to be used.)

As for scattered light, the BSDF at a point determines how light arriving at that point is scattered. Light can, in general, arrive from any direction and can originate from any other point in the scene. The rendering equation holds at every point. It relates the light arriving at and departing from each point to the light arriving at and departing from every other point. It describes, in other words, an immensely complicated system, one for which you are unlikely to be able to find an exact solution. A rendering algorithm can be thought of as an attempt to find a good approximate solution to the rendering equation.

Path tracing is a probabilistic rendering algorithm. It looks at possible paths that might have been followed by light arriving at the position of the viewer. Each possible path has a certain probability. Path tracing generates a random sample of possible paths, choosing paths in the sample according to their probabilities. It uses those paths to create an image that approximates a solution to the rendering equation. It can be shown that as the size of the random sample increases, the image that is generated will approach the true solution. To get a good quality image, the algorithm will have to trace thousands of paths for each pixel in the image, but the result can be an almost shocking level of realism.

Let's think about how it should work. First, consider the case where light is only emitted and reflected by surfaces. As with ray tracing, we start at the position of the viewer and cast a ray in the direction of a point on the image, into the scene. (See Subsection 8.1.1.) We find the first intersection of that ray with an object in the scene. Our goal is to trace one possible path that the light could have followed from its point of origin until it arrives at the viewer, and we want the probability that we select a given path to be the probability that the light actually followed that path. This means that each time the light is scattered from a surface, we should choose the direction of the next segment of the path based on the BSDF for the surface. That is, the direction is chosen at random, using the probability distribution that is encoded in the BSDF. We construct the next segment of the path by casting a ray in the selected direction.

We continue to trace the path, backwards in time, possibly through multiple reflections, until it encounters an object that emits light. That object serves as the original source of the light. The color that the path contributes to the image is determined by the color and intensity of the emitter, by the colors of surfaces that the light hits along the way, and by the angles at which the light hits each surface. If the path escapes from the scene before it hits a light emitting object, then it does not contribute any color to the image. (It might be desirable to have a light-emitting background, like a sky, that emits light over a large area.) Note that it is possible for an object to be both an emitter and a reflector of light. In that case, a path can continue even after it gets to a light source.
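
The following sketch shows the overall shape of tracing one such path, in Python. Everything in it is schematic and hypothetical: the scene is assumed to provide a find_hit method, the hit record is assumed to carry the point, normal, surface color, BSDF, and (for emitters) emission values used below, the color arithmetic ignores the angle at which light hits each surface, and the path simply stops at the first emitter it reaches. The essential structure is the loop that follows the path from one scattering event to the next until it reaches an emitter or escapes from the scene.

    MAX_DEPTH = 10   # give up on paths that bounce too many times

    class Ray:
        def __init__(self, origin, direction):
            self.origin = origin
            self.direction = direction

    def trace_path(ray, scene):
        """Follow one possible light path backwards from the viewer and
        return the color that it contributes to the image."""
        color = (1.0, 1.0, 1.0)                 # accumulated attenuation along the path
        for depth in range(MAX_DEPTH):
            hit = scene.find_hit(ray)           # first intersection of the ray with the scene
            if hit is None:
                return (0.0, 0.0, 0.0)          # path escaped without reaching an emitter
            if hit.emission is not None:
                # Reached an emitter: the path's color is the emitted light,
                # filtered by the surfaces that the light hit along the way.
                return (color[0]*hit.emission[0],
                        color[1]*hit.emission[1],
                        color[2]*hit.emission[2])
            # Attenuate by the surface color, then scatter according to the BSDF.
            color = (color[0]*hit.surface_color[0],
                     color[1]*hit.surface_color[1],
                     color[2]*hit.surface_color[2])
            direction = hit.bsdf.sample_direction(ray.direction, hit.normal)
            ray = Ray(hit.point, direction)     # next segment of the path
        return (0.0, 0.0, 0.0)                  # path was cut off; it contributes nothing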

Of course, we have to trace many such paths. The color for a pixel in the image is computed as an average of the colors obtained for all the paths that pass through that pixel.
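
Averaging the paths for a pixel is then straightforward. In the short sketch below, camera_ray(x, y) stands for a hypothetical function that returns a ray from the viewer through a point chosen inside pixel (x, y), and trace_path is the function from the previous sketch.

    def render_pixel(x, y, scene, paths_per_pixel):
        """Estimate the color of one pixel by averaging many traced paths."""
        total = [0.0, 0.0, 0.0]
        for i in range(paths_per_pixel):
            ray = camera_ray(x, y)      # a ray from the viewer through the pixel (assumed)
            c = trace_path(ray, scene)
            total[0] += c[0]
            total[1] += c[1]
            total[2] += c[2]
        return (total[0]/paths_per_pixel, total[1]/paths_per_pixel, total[2]/paths_per_pixel)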

The algorithm can be extended to handle the case where light can be scattered at arbitrary points in space, and not just at surfaces. For light traveling in a medium in 3D space, the question is, how far will the light travel before it is scattered? The BSDF for the medium will determine a probability distribution on possible travel distances between scatterings. When light enters a medium, that probability distribution is used to select a random distance that the light will travel before it is scattered (unless it hits a surface or enters a new medium before it has traveled that distance). When it scatters from a point in the medium, a new direction and length are chosen at random for the next segment of the path, according to the BSDF of the medium. For a light fog, the average distance between scatterings would be quite large; for a dense medium like milk, it would be quite short.
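
One common model, and the one assumed in the short sketch below, is a homogeneous medium in which the distance to the next scattering event follows an exponential distribution. The parameter scattering_coefficient, the average number of scattering events per unit distance, is an invented name for this illustration; its reciprocal is the average distance between scatterings, which is large for a light fog and small for a dense medium like milk.

    import math, random

    def sample_scattering_distance(scattering_coefficient):
        """Randomly choose how far light travels through a homogeneous medium
        before it is scattered.  The distances follow an exponential
        distribution with mean 1/scattering_coefficient."""
        return -math.log(1 - random.random()) / scattering_coefficient

    print(sample_scattering_distance(0.01))   # thin fog: about 100 units on average
    print(sample_scattering_distance(50.0))   # dense medium: about 0.02 units on average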


A great deal of computation is required to trace enough light paths to get a high-quality image. Although path tracing was invented in the 1980s, it is only recently that it has become practical for general use, and it can still take many hours to produce a high-quality rendering. In fact, you can do path tracing on your desktop computer using the 3D modeling program Blender, which is discussed in Appendix B. Blender has a rendering engine called Cycles that uses path tracing. See the appendix for more information about Blender.