Method of moments (probability theory)

In probability theory, the method of moments is a way of proving convergence in distribution by proving convergence of the corresponding sequences of moments. Suppose $X_1, X_2, \ldots$ is a sequence of random variables, X is a random variable, and all of the moments

$\operatorname{E}(X^k)\,$

exist. Further suppose the probability distribution of X is completely determined by its moments, i.e., there is no other probability distribution with the same sequence of moments (cf. the problem of moments). If

$\lim_{n\to\infty}\operatorname{E}(X_n^k) = \operatorname{E}(X^k)\,$

for all values of k, then the sequence $\{X_n\}$ converges to X in distribution.
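For instance, the de Moivre–Laplace central limit theorem follows this way: the moments of a standardized Binomial(n, 1/2) variable converge to the standard normal moments 0, 1, 0, 3, 0, 15, ..., and the normal distribution is determined by its moments. The following Python snippet is a minimal numerical sketch of that convergence (the helper names are illustrative, not a library API); it computes the binomial moments exactly from the probability mass function.

```python
from math import comb

def standardized_binomial_moment(n, k, p=0.5):
    """Exact E[((S_n - n p) / sqrt(n p (1 - p)))^k] for S_n ~ Binomial(n, p),
    computed by summing over the binomial pmf."""
    mean = n * p
    sd = (n * p * (1 - p)) ** 0.5
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) * ((j - mean) / sd) ** k
               for j in range(n + 1))

def normal_moment(k):
    """Standard normal moments: 0 for odd k, (k - 1)!! for even k."""
    if k % 2 == 1:
        return 0.0
    result = 1.0
    for m in range(k - 1, 0, -2):
        result *= m
    return result

# Each row should approach the standard normal moment as n grows.
for k in range(1, 7):
    exact = [standardized_binomial_moment(n, k) for n in (10, 100, 1000)]
    print(f"k={k}: n=10,100,1000 -> {[round(m, 4) for m in exact]}, "
          f"normal moment = {normal_moment(k)}")
```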

The method of moments is especially useful for proving limit theorems for random matrices with independent entries, such as Wigner's semicircle law.
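As a rough numerical illustration of the semicircle case (a sketch under simple Gaussian assumptions, not the proof itself), one can sample a symmetric matrix with independent normal entries, rescale it by $\sqrt{N}$, and compare its empirical spectral moments with the semicircle moments, which are the Catalan numbers in even orders and 0 in odd orders:

```python
import numpy as np
from math import comb

# Hypothetical check: for a symmetric N x N matrix W with independent
# standard normal entries, the empirical moments (1/N) sum of eigenvalue^k
# of W / sqrt(N) approach the semicircle moments C_{k/2} (even k) or 0 (odd k).
rng = np.random.default_rng(0)
N = 2000
A = rng.standard_normal((N, N))
W = (A + A.T) / np.sqrt(2)                 # symmetric Wigner matrix, off-diagonal variance 1
eigs = np.linalg.eigvalsh(W / np.sqrt(N))  # spectrum concentrates on [-2, 2]

def catalan(m):
    return comb(2 * m, m) // (m + 1)

for k in range(1, 7):
    empirical = np.mean(eigs ** k)
    limit = catalan(k // 2) if k % 2 == 0 else 0
    print(f"k={k}: empirical = {empirical:.4f}, semicircle moment = {limit}")
```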