Filtering is easier to visualize in one dimension. Sound recordings are examples of one-dimensional signals.
Below is a signal that is a sum of low and high frequency waves.
In frequency space, this signal has two spikes at frequencies 2 and 50. Mathematically speaking, decomposing a signal into its frequency components is done by means of a Fourier transform. A Fourier transform of a signal is also called its spectrum.
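The two spikes can be found numerically. A minimal sketch (the sample count and amplitudes are my own choices, not from the text): sample a sum of two sine waves at frequencies 2 and 50, take its Fourier transform, and locate the largest spectral components.

```python
import numpy as np

# Sample a sum of a low- and a high-frequency sine wave over one unit
# interval, then locate the spikes in its spectrum with an FFT.
n = 1024
t = np.arange(n) / n
signal = np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)

spectrum = np.abs(np.fft.rfft(signal))   # magnitude spectrum
peaks = np.argsort(spectrum)[-2:]        # indices of the two largest spikes
print(sorted(peaks.tolist()))            # -> [2, 50]
```

Bin k of the transform corresponds to k cycles per unit interval, so the two spikes land exactly at 2 and 50.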
Filtering is done by averaging multiple samples to get each new sample. The simplest averaging assigns the same weight to each sample within a small range. This corresponds to a box filter: a filter whose components all have equal weight up to a fixed count, and are zero beyond it.
This is the mathematical formula that describes the filtering process.
The j-th sample of the result, yj, is a sum of five consecutive samples of the source signal, starting with xj, with weights given by the box filter. This formula is called a convolution of two sequences (Box and x). Filtering a signal means convolving it with the filter.
The result of filtering is a new signal that has the low-frequency component intact, but with the high-frequency component gone. In every sum of five consecutive samples, the high-frequency component contributed an (almost) equal number of positive and negative terms. They averaged out to zero.
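The averaging-out effect can be sketched directly. This is an illustration under one assumption of my own: the sampling rate is chosen so that the high-frequency wave has a period of exactly five samples, so each five-sample box window spans one full period of it.

```python
import numpy as np

# y[j] = (x[j] + x[j+1] + ... + x[j+4]) / 5 -- the five-sample box filter.
n = 250
t = np.arange(n) / n
low = np.sin(2 * np.pi * 2 * t)
high = 0.5 * np.sin(2 * np.pi * 50 * t)     # period: 250/50 = 5 samples

box = np.full(5, 1.0 / 5.0)                 # five equal weights summing to 1
y = np.convolve(low + high, box, mode='valid')  # y[j] = sum_k box[k] * x[j+k]

# Each window of five samples covers one full period of the high wave,
# so its positive and negative terms cancel exactly.
residual = np.convolve(high, box, mode='valid')
print(np.max(np.abs(residual)) < 1e-9)      # -> True
```

The low-frequency component passes through nearly unchanged, while the high-frequency component vanishes.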
The goal of low pass filtering is getting rid of high frequencies (or fine detail, in the case of pictures). In frequency space, this would correspond to multiplying the spectrum of the signal by a function that is one for lower frequencies and zero for higher frequencies. We want the Fourier transform of our filter to look like a step function.
Given its Fourier transform, we would like to see what the filter itself looks like, so that we can do the averaging. We have to apply the inverse Fourier transform (which is virtually the same as the forward transform) to our step function. The result is a sinc function:
sinc(k) = sin(k)/k
Unlike a box filter, the sinc has both positive and negative values. These make sense when averaging a sound signal, which itself has positive and negative values. They don't make sense when filtering a bitmap, which has only positive values of RGB (we might get, for instance, a negative value for the red component of some pixel!). In addition, the sinc function extends to infinity (although its magnitude quickly decreases, like 1/x). It means that for each sample of the output we would theoretically have to average an infinite number of input samples.
In practice, one either has to window a sinc (multiply it by a step function), or use other, simpler filters, like the box filter.
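A sketch of windowing a sinc as described above: multiply the infinite sinc by a step function that is one on a fixed number of taps and zero elsewhere (the tap count and cutoff here are my own choices, not from the text).

```python
import numpy as np

# Truncate the sinc to 21 taps -- the "window by a step function" approach.
taps = 21
k = np.arange(taps) - taps // 2       # symmetric positions -10..10
cutoff = 0.1                          # cutoff as a fraction of the sampling rate
h = np.sinc(2 * cutoff * k)           # np.sinc(x) = sin(pi*x)/(pi*x)
h /= h.sum()                          # normalize: constant signals pass unchanged

def gain(f):
    """Filter response at frequency f (as a fraction of the sampling rate)."""
    return abs(np.sum(h * np.exp(-2j * np.pi * f * k)))

print(round(gain(0.02), 2), round(gain(0.25), 2))  # passband near 1, stopband near 0
```

Truncation keeps the filter finite but introduces some ripple in the response; smoother windows (Hamming, Blackman, etc.) trade a wider transition band for less ripple.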
Other popular types of filters are triangular and Gaussian (bell-shaped). The interesting thing about a Gaussian filter, exp(-ax²), is that its Fourier transform is also a Gaussian.
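This self-duality can be checked numerically. A sketch, using one assumption of my own: with a = π the Gaussian exp(-πx²) transforms into exactly the same Gaussian in frequency.

```python
import numpy as np

# Sample exp(-pi*x^2) finely over a wide range and compare the magnitude
# of its FFT (scaled to approximate the continuous transform) against
# the same Gaussian evaluated at the matching frequencies.
n, dx = 256, 1.0 / 16.0
x = (np.arange(n) - n // 2) * dx          # sample points on [-8, 8)
g = np.exp(-np.pi * x * x)

G = dx * np.abs(np.fft.fft(g))            # magnitude of the continuous FT
f = np.fft.fftfreq(n, d=dx)               # matching frequency samples
err = np.max(np.abs(G - np.exp(-np.pi * f * f)))
print(err)                                 # tiny: the spectrum is exp(-pi*f^2) again
```

For general a, the transform of exp(-ax²) is a Gaussian of reciprocal width: a narrow filter in space means a wide one in frequency, and vice versa.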
In addition, one should take into account the fact that a pixel might be only partially covered by a polygon. To be exact, one should clip the polygon to the pixel and back-project such a fragment to texture space. Such calculations might in fact be done when using the A-buffer for anti-aliasing.
Notice that all these algorithms perform more calculations the more distant the polygon is and the smaller its image on the screen. This doesn't seem like a good allocation of computing resources.
There is a technique that can dramatically speed up the averaging over rectangular areas. Instead of storing the texture as a bitmap, one can create a summed area table. The elements of such a table are sums of the RGB components of texels over whole areas: the element indexed by (u, v) contains the sum of all texels located to the left of and above the texel (u, v) (the upper-left rectangle). To calculate the sum of texels within a rectangle defined by (u1, v1, u2, v2), we combine four elements of the table:

Sum[u2][v2] - Sum[u1][v2] - Sum[u2][v1] + Sum[u1][v1]
In Fig. 5 we can see this construction geometrically. The small grey rectangle is obtained by subtracting the two red rectangles from the large blue rectangle and then adding back the smaller blue rectangle (to compensate for the area covered twice by the two red rectangles).
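The construction can be sketched in a few lines. One convention here is my own: the table is padded with an extra row and column of zeros, so the four-lookup formula above also works for rectangles touching the edges.

```python
import numpy as np

# Build a summed area table for a toy 4x4 "texture". Entry sat[u][v]
# holds the sum of all texels in the upper-left rectangle ending at (u, v).
texture = np.arange(16).reshape(4, 4)
sat = np.zeros((5, 5), dtype=texture.dtype)
sat[1:, 1:] = texture.cumsum(axis=0).cumsum(axis=1)

def rect_sum(u1, v1, u2, v2):
    """Sum of texels with u1 <= u < u2 and v1 <= v < v2: four lookups."""
    return sat[u2][v2] - sat[u1][v2] - sat[u2][v1] + sat[u1][v1]

print(rect_sum(1, 1, 3, 3))   # texels 5, 6, 9, 10 -> 30
```

Once the table is built, averaging over any axis-aligned rectangle costs four lookups and a division by its area, regardless of how many texels it covers.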
Next: Using mip-maps to speed up texture mapping.