Smoothing is a class of time series processing intended to reduce noise while preserving the signal itself. The origin of this term is related to the visual appearance of the time series – it looks smoother after this sort of processing than does the original time series.
The logical background for smoothing is the presumption that there is a distinction between the properties of the “signal” and those of the “noise”: the signal is a smooth function, having no abrupt changes between two adjacent moments of time, while the noise is a function with abrupt random changes between adjacent time-points. If this is the case, a smoothing procedure will reduce the noise and preserve the signal.
Because it is not possible to cleanly separate signal from noise, smoothing is always a compromise between noise reduction and signal distortion. A high degree of smoothing will substantially reduce the noise, but will also reduce or distort the signal.
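A minimal sketch of this idea, using a simple moving-average smoother (one common smoothing procedure; the signal, noise level, and window sizes here are illustrative assumptions, not part of the original text):

```python
import numpy as np

rng = np.random.default_rng(0)

# A smooth "signal" (slow sine wave) plus abrupt random "noise".
t = np.linspace(0, 2 * np.pi, 200)
signal = np.sin(t)
noisy = signal + rng.normal(scale=0.5, size=t.size)

def moving_average(x, window):
    """Replace each point with the mean of `window` neighbouring points."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

smoothed = moving_average(noisy, 9)

# Away from the edges, the smoothed series lies closer to the true
# signal than the noisy input does (noise is reduced), though the
# smoothing also slightly flattens the peaks of the sine wave
# (signal distortion) -- the compromise described above.
err_noisy = np.std((noisy - signal)[30:-30])
err_smooth = np.std((smoothed - signal)[30:-30])
print(err_smooth < err_noisy)
```

A wider window would reduce the residual noise further but flatten the signal more, which is the trade-off in question.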
For example, if we have a sound signal, smoothing of this signal will inevitably result in a reduction of the intensity of higher frequencies. If the higher frequencies are reduced too much, the quality of the sound decreases.
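This frequency-dependent attenuation can be sketched as follows; the sample rate, the two test frequencies, and the window length are illustrative assumptions:

```python
import numpy as np

# Two pure tones sampled at 1000 Hz: a low and a high frequency.
t = np.linspace(0, 1, 1000, endpoint=False)
low = np.sin(2 * np.pi * 5 * t)     # 5 Hz component
high = np.sin(2 * np.pi * 100 * t)  # 100 Hz component

# The same moving-average smoother applied to both.
kernel = np.ones(11) / 11
smooth_low = np.convolve(low, kernel, mode="same")
smooth_high = np.convolve(high, kernel, mode="same")

# Peak amplitude after smoothing (edges ignored): the low-frequency
# tone passes almost unchanged, the high-frequency tone is strongly
# attenuated -- exactly the loss of higher frequencies noted above.
amp_low = smooth_low[100:-100].max()
amp_high = smooth_high[100:-100].max()
print(amp_low > 0.9, amp_high < 0.5)
```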
Besides time series, which are 1-dimensional entities, smoothing is also used for processing images (2-dimensional objects) and spatial fields (2- and 3-dimensional objects).
See also: Smoother (Smoothing Filter).