In statistics, exponential smoothing refers to a particular type of moving average technique applied to time series data, either to produce smoothed data for presentation, or to make forecasts. The time series data themselves are a sequence of observations. The observed phenomenon may be an essentially random process, or it may be an orderly, but noisy, process.

Exponential smoothing is commonly applied to financial market and economic data, but it can be used with any discrete set of repeated measurements. The raw data sequence is often represented by {x_t}, and the output of the exponential smoothing algorithm is commonly written as {s_t}, which may be regarded as the best estimate of what the next value of x will be. When the sequence of observations begins at time t = 0, the simplest form of exponential smoothing is given by the formulas

s_0 = x_0
s_t = α x_t + (1 − α) s_{t−1},   t > 0

where α is the smoothing factor, and 0 < α < 1.

The simple moving average

Intuitively, the simplest way to smooth a time series is to calculate a simple, or unweighted, moving average. The smoothed statistic s_t is then just the mean of the last k observations:

s_t = (x_t + x_{t−1} + … + x_{t−k+1}) / k

where the choice of an integer k > 1 is arbitrary. A small value of k will have less of a smoothing effect and be more responsive to recent changes in the data, while a larger k will have a greater smoothing effect, and produce a more pronounced lag in the smoothed sequence. One disadvantage of this technique is that it cannot be used on the first k − 1 terms of the time series.
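
As a concrete sketch, the following Python function computes a simple moving average over a window of k observations, leaving the first k − 1 positions undefined just as described above. The function name and the example data are ours, chosen purely for illustration.

def simple_moving_average(x, k):
    """Unweighted moving average of the last k observations of x.

    The first k - 1 entries are None because the average is not
    defined until k observations have been made.
    """
    smoothed = [None] * (k - 1)
    for t in range(k - 1, len(x)):
        window = x[t - k + 1 : t + 1]      # the k most recent observations
        smoothed.append(sum(window) / k)   # their unweighted mean
    return smoothed

# Example with a window of k = 3 (output values shown rounded)
print(simple_moving_average([3, 5, 8, 10, 12, 15], 3))
# [None, None, 5.33, 7.67, 10.0, 12.33]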

The weighted moving average

A slightly more intricate method for smoothing a raw time series {x_t} is to calculate a weighted moving average by first choosing a set of weighting factors

{w_1, w_2, …, w_k}

such that

w_1 + w_2 + … + w_k = 1

and then using these weights to calculate the smoothed statistics {s_t}:

s_t = w_1 x_t + w_2 x_{t−1} + … + w_k x_{t−k+1}

In practice the weighting factors are often chosen to give more weight to the most recent terms in the time series and less weight to older data. Notice that this technique has the same disadvantage as the simple moving average technique (i.e., it cannot be used until at least k observations have been made), and that it entails a more complicated calculation at each step of the smoothing procedure.
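
A Python sketch of the same idea follows; the particular weights in the example (0.5, 0.3, 0.2) sum to one and favour the most recent observation, but they are illustrative only.

def weighted_moving_average(x, weights):
    """Weighted moving average of x.

    weights[0] multiplies the most recent observation, weights[1] the
    one before it, and so on; the weights are assumed to sum to 1.
    """
    k = len(weights)
    smoothed = [None] * (k - 1)            # undefined until k observations exist
    for t in range(k - 1, len(x)):
        smoothed.append(sum(w * x[t - n] for n, w in enumerate(weights)))
    return smoothed

# Example: heavier weight on recent data
print(weighted_moving_average([3, 5, 8, 10, 12, 15], [0.5, 0.3, 0.2]))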

The exponential moving average

The simplest form of exponential smoothing is given by the formulas

s_0 = x_0
s_t = α x_t + (1 − α) s_{t−1},   t > 0

where α is the smoothing factor, and 0 < α < 1. In other words, the smoothed statistic s_t is a simple weighted average of the latest observation x_t and the previous smoothed statistic s_{t−1}. Simple exponential smoothing is easily applied, and it produces a smoothed statistic as soon as two observations are available.
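
The recursion is equally easy to state in code. The Python sketch below initialises the smoothed series with the first observation, s_0 = x_0, as in the formulas above; the function name and sample data are ours.

def exponential_smoothing(x, alpha):
    """Simple exponential smoothing with smoothing factor 0 < alpha < 1.

    s_0 = x_0, and s_t = alpha * x_t + (1 - alpha) * s_{t-1} for t > 0.
    """
    s = [x[0]]                                        # s_0 = x_0
    for t in range(1, len(x)):
        s.append(alpha * x[t] + (1 - alpha) * s[-1])  # weighted average of x_t and s_{t-1}
    return s

# A small alpha smooths heavily; an alpha near 1 tracks the data closely
print(exponential_smoothing([3, 5, 8, 10, 12, 15], alpha=0.3))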

Values of α close to unity have less of a smoothing effect and give greater weight to recent changes in the data, while values of α closer to zero have a greater smoothing effect and are less responsive to recent changes. There is no formally correct procedure for choosing α. Sometimes the statistician's judgment is used to choose an appropriate factor. Alternatively, a statistical technique may be used to optimize the value of α. For example, the method of least squares might be used to determine the value of α for which the sum of the quantities (s_{n−1} − x_n)^2 is minimized.
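
As one possible way of carrying out such an optimization, the sketch below performs a simple grid search over candidate values of α, scoring each candidate by the sum of squared one-step-ahead errors (s_{n−1} − x_n)^2. The grid spacing and the helper names are our own choices, not a standard routine.

def one_step_sse(x, alpha):
    """Sum of the squared one-step-ahead errors (s_{n-1} - x_n)^2 for a given alpha."""
    s = x[0]                                   # s_0 = x_0
    sse = 0.0
    for n in range(1, len(x)):
        sse += (s - x[n]) ** 2                 # error of forecasting x_n by s_{n-1}
        s = alpha * x[n] + (1 - alpha) * s     # update the smoothed statistic
    return sse

def best_alpha(x):
    """Grid-search the alpha in (0, 1) that minimises the one-step SSE."""
    candidates = [n / 100 for n in range(1, 100)]   # 0.01, 0.02, ..., 0.99
    return min(candidates, key=lambda a: one_step_sse(x, a))

# For a steadily rising series the optimal alpha is close to 1
print(best_alpha([3, 5, 8, 10, 12, 15]))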

This simple form of exponential smoothing is also known as "Brown's exponential smoothing" and as an "exponentially weighted moving average". Technically it can also be classified as an ARIMA(0,1,1) model with no constant term.[1]

Why is it "exponential"?[]

By direct substitution of the defining equation for simple exponential smoothing back into itself we find that

s_t = α x_t + (1 − α) s_{t−1}
    = α x_t + α (1 − α) x_{t−1} + (1 − α)^2 s_{t−2}
    = α [x_t + (1 − α) x_{t−1} + (1 − α)^2 x_{t−2} + (1 − α)^3 x_{t−3} + … + (1 − α)^{t−1} x_1] + (1 − α)^t x_0

In other words, as time passes the smoothed statistic s_t becomes the weighted average of a greater and greater number of the past observations x_{t−n}, and the weights assigned to previous observations are in general proportional to the terms of the geometric progression {1, (1 − α), (1 − α)^2, (1 − α)^3, …}. A geometric progression is the discrete version of an exponential function, which is how this smoothing method got its name.
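
The geometric decay of the weights can be checked numerically. The short Python sketch below computes s_t both recursively and as the expanded weighted sum above and confirms that the two agree; the data and the value of α are arbitrary.

alpha = 0.3
x = [3, 5, 8, 10, 12, 15]

# Recursive form: s_0 = x_0, then s_t = alpha * x_t + (1 - alpha) * s_{t-1}
s = x[0]
for t in range(1, len(x)):
    s = alpha * x[t] + (1 - alpha) * s

# Expanded form: geometric weights on x_t, x_{t-1}, ..., x_1, plus (1 - alpha)^t on x_0
T = len(x) - 1
expanded = sum(alpha * (1 - alpha) ** n * x[T - n] for n in range(T)) + (1 - alpha) ** T * x[0]

print(s, expanded)   # the two values agree up to floating-point rounding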


Notes

  1. "ARIMA" is an acronym for AutoRegressive Integrated Moving Average.


This page uses Creative Commons Licensed content from Wikipedia.