Iteratively reweighted least squares


The method of iteratively reweighted least squares (IRLS) is used to solve certain optimization problems with objective functions of the form:

\underset{\boldsymbol\beta} {\operatorname{arg\,min}} \sum_{i=1}^n w_i (\boldsymbol\beta) \big| y_i - f_i (\boldsymbol\beta) \big|^2,

by an iterative method in which each step involves solving a weighted least squares problem of the form:

\boldsymbol\beta^{(t+1)} = \underset{\boldsymbol\beta} {\operatorname{arg\,min}} \sum_{i=1}^n w_i (\boldsymbol\beta^{(t)}) \big| y_i - f_i (\boldsymbol\beta) \big|^2.

IRLS is used to find the maximum likelihood estimates of a generalized linear model, and in robust regression to find an M-estimator, as a way of mitigating the influence of outliers in an otherwise normally distributed data set, for example by minimizing the least absolute errors rather than the least squared errors.
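
As a concrete illustration of the generalized linear model case, the following is a minimal NumPy sketch of IRLS (Fisher scoring) for logistic regression; the function name, iteration cap, and weight floor are illustrative choices, not taken from any particular library:

    import numpy as np

    def irls_logistic(X, y, n_iter=25, tol=1e-8):
        # X: (n, k) design matrix; y: (n,) vector of 0/1 responses.
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):
            eta = X @ beta                              # linear predictor
            mu = 1.0 / (1.0 + np.exp(-eta))             # fitted probabilities
            w = np.clip(mu * (1.0 - mu), 1e-10, None)   # IRLS weights, floored to stay positive
            z = eta + (y - mu) / w                      # working response
            # Weighted least squares step: beta = (X^T W X)^{-1} X^T W z
            WX = X * w[:, None]
            beta_new = np.linalg.solve(X.T @ WX, X.T @ (w * z))
            if np.max(np.abs(beta_new - beta)) < tol:
                return beta_new
            beta = beta_new
        return beta

Each pass refits a weighted least squares problem with weights and working response recomputed from the current fit, which is exactly the iteration scheme above.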

Although not a linear regression problem, Weiszfeld's algorithm for approximating the geometric median can also be viewed as a special case of iteratively reweighted least squares, in which the objective function is the sum of distances of the estimator from the samples.
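
Read this way, each Weiszfeld step is a weighted average of the sample points, with weights equal to reciprocal distances from the current estimate. A minimal NumPy sketch (the small distance floor is an implementation guard against a zero distance, not part of the classical algorithm):

    import numpy as np

    def weiszfeld(points, n_iter=100, tol=1e-9):
        # points: (n, d) array; returns an approximate geometric median.
        x = points.mean(axis=0)                     # start from the centroid
        for _ in range(n_iter):
            d = np.linalg.norm(points - x, axis=1)
            d = np.maximum(d, 1e-12)                # guard against a zero distance
            w = 1.0 / d                             # IRLS weights: inverse distances
            x_new = (w[:, None] * points).sum(axis=0) / w.sum()
            if np.linalg.norm(x_new - x) < tol:
                return x_new
            x = x_new
        return x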

One of the advantages of IRLS over linear and convex programming is that it can be used with Gauss–Newton and Levenberg–Marquardt numerical algorithms.

Examples

L1 minimization for sparse recovery

IRLS can be used for \ell_1 minimization and smoothed \ell_p minimization, p < 1, in compressed sensing problems. It has been proved that the algorithm has a linear rate of convergence for the \ell_1 norm and superlinear for \ell_t with t < 1, under the restricted isometry property, which is generally a sufficient condition for sparse solutions.[1][2]
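
As a sketch of how such an IRLS scheme can look for the noiseless problem min ||x||_1 subject to Ax = b, assuming NumPy: each weighted step has the closed form x = D A^T (A D A^T)^{-1} b with D = diag(sqrt(x_i^2 + eps^2)), and the fixed geometric decay of the smoothing parameter eps used here is a simplification of the update rule analysed by Daubechies et al.[2]

    import numpy as np

    def irls_sparse(A, b, n_iter=50, eps=1.0):
        # min ||x||_1  s.t.  Ax = b, via smoothed IRLS: each step solves
        # min sum_i w_i x_i^2 s.t. Ax = b with w_i = (x_i^2 + eps^2)^(-1/2),
        # whose closed form is x = D A^T (A D A^T)^{-1} b, D = diag(1/w_i).
        x = np.linalg.lstsq(A, b, rcond=None)[0]    # minimum-norm starting point
        for _ in range(n_iter):
            d = np.sqrt(x**2 + eps**2)              # diagonal of D
            x = d * (A.T @ np.linalg.solve((A * d) @ A.T, b))
            eps *= 0.9                              # shrink the smoothing (simple rule)
        return x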

Lp norm linear regression

To find the parameters β = (β1, …, βk)^T which minimize the Lp norm for the linear regression problem,


\underset{\boldsymbol\beta}{\operatorname{arg\,min}} \big\| \mathbf y - X \boldsymbol\beta \big\|_p = \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \sum_{i=1}^n \left| y_i - X_i \boldsymbol\beta \right|^p ,

the IRLS algorithm at step t + 1 involves solving the weighted linear least squares problem:[3]


\boldsymbol\beta^{(t+1)} = \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \sum_{i=1}^n w_i^{(t)} \left| y_i - X_i \boldsymbol\beta \right|^2 = (X^{\rm T} W^{(t)} X)^{-1} X^{\rm T} W^{(t)} \mathbf{y},

where W^{(t)} is the diagonal matrix of weights, usually with all elements set initially to w_i^{(0)} = 1 and updated after each iteration to:

w_i^{(t)} = \big|y_i - X_i \boldsymbol \beta ^{(t)} \big|^{p-2} .

In the case p = 1, this corresponds to least absolute deviation regression (though in that case the problem is better approached by linear programming methods).
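
Putting the update and weight formulas above into code, a minimal NumPy sketch; the residual floor delta is a standard implementation guard (the weights |r|^{p-2} blow up at zero residuals when p < 2) and is not part of the formula:

    import numpy as np

    def irls_lp(X, y, p=1.2, n_iter=50, delta=1e-8):
        # Minimize ||y - X beta||_p by iteratively reweighted least squares.
        beta = np.linalg.lstsq(X, y, rcond=None)[0]        # start from ordinary LS
        for _ in range(n_iter):
            r = np.abs(y - X @ beta)
            w = np.maximum(r, delta) ** (p - 2)            # diagonal of W^(t)
            WX = X * w[:, None]
            # Weighted least squares step: beta = (X^T W X)^{-1} X^T W y
            beta = np.linalg.solve(X.T @ WX, X.T @ (w * y))
        return beta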

Notes

  1. Chartrand, R.; Yin, W. (March 31 – April 4, 2008). "Iteratively reweighted algorithms for compressive sensing". IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2008: 3869–3872.
  2. Daubechies, I. et al. (2008). "Iteratively reweighted least squares minimization for sparse recovery". URL accessed on 2010-11-02.
  3. Gentle, James (2007). "6.8.1 Solutions that Minimize Other Norms of the Residuals". Matrix Algebra. New York: Springer.

This page uses Creative Commons licensed content from Wikipedia.
