
In probability theory, the method of moments is a way of proving convergence in distribution by proving convergence of the corresponding sequences of moments. Suppose X is a random variable and that all of the moments

\operatorname{E}(X^k),\qquad k = 1, 2, 3, \dots

exist. Further suppose that the probability distribution of X is completely determined by its moments, i.e., there is no other probability distribution with the same sequence of moments (cf. the problem of moments). If {X_n} is a sequence of random variables such that

\lim_{n\to\infty}\operatorname{E}(X_n^k) = \operatorname{E}(X^k)\,

for all values of k, then the sequence {X_n} converges to X in distribution.
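As a numerical sketch of this statement (an illustration added here, not part of the original article), take X_n to be a standardized Binomial(n, 1/2) variable and X a standard normal variable, whose moments (0, 1, 0, 3, …) determine it uniquely. The exact moments E(X_n^k) can be computed from the binomial pmf and checked against the normal moments as n grows; the function name below is our own choice.

```python
from math import comb

def standardized_binomial_moment(n, k, p=0.5):
    # k-th moment of (S_n - np) / sqrt(np(1-p)), where S_n ~ Binomial(n, p),
    # computed exactly by summing over the binomial pmf.
    mean = n * p
    sd = (n * p * (1 - p)) ** 0.5
    return sum(
        comb(n, j) * p**j * (1 - p) ** (n - j) * ((j - mean) / sd) ** k
        for j in range(n + 1)
    )

# Standard normal moments: E(X) = 0, E(X^2) = 1, E(X^3) = 0, E(X^4) = 3.
# The binomial moments approach these as n -> infinity.
for k in range(1, 5):
    print(k, [round(standardized_binomial_moment(n, k), 4) for n in (10, 100, 1000)])
```

Here the method of moments recovers the central limit theorem for this special case: since every E(X_n^k) converges to the corresponding normal moment, {X_n} converges to the standard normal in distribution.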

The method of moments is especially useful for proving limit theorems for random matrices with independent entries, such as Wigner's semicircle law.
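In the random-matrix setting, the moments in question are the averaged trace moments (1/n) E[tr((W/√n)^k)], which for a Wigner matrix converge to the moments of the semicircle distribution: the Catalan numbers 1, 2, 5, … for k = 2, 4, 6 and 0 for odd k. The following Monte Carlo sketch (our own illustration, with hypothetical function names; the entry distribution and sample sizes are arbitrary choices) estimates these moments empirically.

```python
import numpy as np

rng = np.random.default_rng(0)

def trace_moment(n, k, trials=50):
    # Monte Carlo estimate of (1/n) E[tr((W / sqrt(n))^k)] for an n x n
    # Wigner matrix W (symmetric, independent Gaussian entries).
    vals = []
    for _ in range(trials):
        a = rng.standard_normal((n, n))
        w = (a + a.T) / np.sqrt(2)  # symmetric; off-diagonal entries ~ N(0, 1)
        eigs = np.linalg.eigvalsh(w / np.sqrt(n))
        vals.append(np.mean(eigs**k))  # (1/n) tr((W/sqrt(n))^k)
    return float(np.mean(vals))

# Semicircle moments are the Catalan numbers: 1, 2, 5 for k = 2, 4, 6.
for k in (2, 4, 6):
    print(k, round(trace_moment(200, k), 3))
```

Matching these trace moments, for every fixed k, to the Catalan numbers is exactly how the method-of-moments proof of the semicircle law proceeds; the combinatorial step counts the non-crossing pair partitions that survive in the expansion of E[tr(W^k)].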

This page uses Creative Commons Licensed content from Wikipedia (view authors).
