# Minimum mean-square error

In statistics and signal processing, a **minimum mean square error** (**MMSE**) estimator is an estimator that minimizes the mean square error (MSE), a common measure of estimator quality.

The term MMSE specifically refers to estimation in a Bayesian setting, since in the alternative frequentist setting there does not exist a single estimator having minimal MSE. A somewhat similar concept can be obtained within the frequentist point of view if one requires unbiasedness, since an estimator may exist that minimizes the variance (and hence the MSE) among unbiased estimators. Such an estimator is then called the minimum-variance unbiased estimator (MVUE).

## Definition

Let $x$ be an unknown random variable, and let $y$ be a known random variable (the measurement). An estimator $\hat{x}(y)$ is any function of the measurement $y$, and its MSE is given by

$\mathrm{MSE} = E\left\{ (\hat{x} - x)^2 \right\}$

where the expectation is taken over both $x$ and $y$.

The MMSE estimator is then defined as the estimator achieving minimal MSE.

In many cases, it is not possible to determine a closed form for the MMSE estimator. In these cases, one possibility is to seek the technique minimizing the MSE within a particular class, such as the class of linear estimators. The **linear MMSE** estimator is the estimator achieving minimum MSE among all estimators of the form $Ay + b$. If the measurement $y$ is a random vector, $A$ is a matrix and $b$ is a vector. (Such an estimator would more correctly be termed an *affine MMSE* estimator, but the term linear estimator is widely used.)
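The linear MMSE coefficients have a well-known closed form in terms of second-order moments: $A = \mathrm{Cov}(x, y)\,\mathrm{Cov}(y)^{-1}$ and $b = E[x] - A\,E[y]$. A minimal numeric sketch of this, using a made-up joint model (the distribution below is illustrative and not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical joint model (illustrative only): scalar x, two noisy measurements.
n = 200_000
x = rng.normal(0.0, 1.0, n)
y = np.stack([x + rng.normal(0.0, 0.5, n),         # y1 = x + noise
              2.0 * x + rng.normal(0.0, 1.0, n)])  # y2 = 2x + noise

# Linear MMSE coefficients from second-order moments:
#   A = Cov(x, y) Cov(y)^{-1},  b = E[x] - A E[y]
Cyy = np.cov(y)                                    # 2x2 measurement covariance
Cxy = np.array([np.cov(x, yi)[0, 1] for yi in y])  # cross-covariances with x
A = Cxy @ np.linalg.inv(Cyy)
b = x.mean() - A @ y.mean(axis=1)

x_hat = A @ y + b                # the linear (affine) MMSE estimate
mse = np.mean((x_hat - x) ** 2)  # smaller than either raw measurement alone
```

Combining both measurements this way yields a lower MSE than using either measurement by itself, which is the point of minimizing over the whole linear class.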

## Properties

- Under some weak regularity assumptions,^{[1]} the MMSE estimator is uniquely defined, and is given by $\hat{x}_{\mathrm{MMSE}}(y) = E\left\{ x \mid y \right\}$.

- In other words, the MMSE estimator is the conditional expectation of $x$ given the observed value of the measurements.

- If $x$ and $y$ are jointly Gaussian, then the MMSE estimator is linear, i.e., it has the form $ay + b$ for constants $a$ and $b$. As a consequence, to find the MMSE estimator, it is sufficient to find the linear MMSE estimator. Such a situation occurs in the example presented in the next section.

- The orthogonality principle: An estimator $\hat{x}$ is MMSE if and only if $E\left\{ (\hat{x} - x)\, f(y) \right\} = 0$ for all functions $f(y)$ of the measurements. A different version of the orthogonality principle exists for linear MMSE estimators.

## Example

An example can be shown by using a linear combination of the random variables $z_1$, $z_2$ and $z_3$ to estimate another random variable $z_4$ using $\hat{z}_4 = \sum_{i=1}^{3} a_i z_i$. Suppose the random variables $z = [z_1, z_2, z_3, z_4]^T$ are real Gaussian random variables with zero mean and known covariance matrix.

We will estimate $z_4$ and find coefficients $a_i$ such that $\hat{z}_4$ is an optimal estimate of $z_4$. We will use the autocorrelation matrix, $R$, and the cross correlation vector, $C$, to find the vector $A = [a_1, a_2, a_3]^T$, which consists of the coefficient values that will minimize the MSE of the estimate. The autocorrelation matrix is defined as

$R = \begin{bmatrix} E[z_1 z_1] & E[z_2 z_1] & E[z_3 z_1] \\ E[z_1 z_2] & E[z_2 z_2] & E[z_3 z_2] \\ E[z_1 z_3] & E[z_2 z_3] & E[z_3 z_3] \end{bmatrix}.$

The cross correlation vector is defined as

$C = \begin{bmatrix} E[z_4 z_1] \\ E[z_4 z_2] \\ E[z_4 z_3] \end{bmatrix}.$

In order to find the optimal coefficients by the orthogonality principle we solve the equation $RA = C$ by inverting and multiplying to get

$A = R^{-1} C.$

The entries of $A$ are the optimal coefficients for $\hat{z}_4$. Computing the minimum mean square error then gives $E[z_4^2] - A^T C$.^{[2]}
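This procedure can be sketched numerically. The covariance matrix below is a hypothetical, positive-definite stand-in chosen for illustration; it is not the set of values used in the original worked example:

```python
import numpy as np

# Illustrative covariance of [z1, z2, z3, z4] (hypothetical values,
# chosen to be positive definite).
cov = np.array([[2.0, 0.6, 0.5, 0.6],
                [0.6, 2.0, 0.5, 0.6],
                [0.5, 0.5, 2.0, 0.5],
                [0.6, 0.6, 0.5, 2.0]])

R = cov[:3, :3]   # autocorrelation matrix of the measurements z1, z2, z3
C = cov[:3, 3]    # cross correlation of z4 with the measurements

# Orthogonality principle: R A = C, hence A = R^{-1} C.
A = np.linalg.solve(R, C)

# Minimum mean square error: E[z4^2] - A^T C.
mmse = cov[3, 3] - A @ C
```

Note `np.linalg.solve` is preferred over forming `R`'s inverse explicitly; it solves $RA = C$ directly and is numerically better behaved.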

A shorter, non-numerical example can be found in the article on the orthogonality principle.

## See also

- Bayesian estimator
- Mean squared error
- Minimum-variance unbiased estimator (MVUE)
- Orthogonality principle

## Notes

## Further reading

- Johnson, D. (22 November 2004). *Minimum Mean Squared Error Estimators*. Connexions.
- Bibby, J.; Toutenburg, H. (1977). *Prediction and Improved Estimation in Linear Models*. Wiley. This book looks almost exclusively at minimum mean-square error estimation and inference.
- Jaynes, E. T. (2003). *Probability Theory: The Logic of Science*. Cambridge University Press.
- Lehmann, E. L.; Casella, G. (1998). *Theory of Point Estimation*, 2nd ed., ch. 4. Springer.
- Kay, S. M. (1993). *Fundamentals of Statistical Signal Processing: Estimation Theory*, pp. 344–350. Prentice Hall.
- Moon, T. K.; Stirling, W. C. (2000). *Mathematical Methods and Algorithms for Signal Processing*. Prentice Hall.

This page uses Creative Commons Licensed content from Wikipedia (view authors).