In probability theory and statistics, the kth moment about the mean (or kth central moment) of a real-valued random variable X is the quantity E[(X − E[X])^k], where E is the expectation operator. Some random variables have no mean, in which case the moment about the mean is not defined. The kth moment about the mean is often denoted μ_k. For a continuous univariate probability distribution with probability density function f(x), the kth moment about the mean μ = E[X] is

μ_k = E[(X − μ)^k] = ∫_{−∞}^{+∞} (x − μ)^k f(x) dx.
Sometimes it is convenient to convert moments about the origin to moments about the mean. The general equation for converting the nth-order moment about the origin to the moment about the mean is

μ_n = E[(X − m)^n] = Σ_{j=0}^{n} C(n, j) (−m)^{n−j} μ'_j,

where m is the mean of the distribution, C(n, j) is the binomial coefficient, and the jth moment about the origin is given by

μ'_j = E[X^j] = ∫_{−∞}^{+∞} x^j f(x) dx.
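The binomial-expansion conversion above can be checked numerically; the discrete pmf, the fair-die example, and the helper names `raw_moment` / `central_from_raw` are illustrative assumptions:

```python
from math import comb

def raw_moment(pmf, j):
    """j-th moment about the origin, E[X^j], for a {value: probability} pmf."""
    return sum(x ** j * p for x, p in pmf.items())

def central_from_raw(pmf, n):
    """n-th central moment via mu_n = sum_{j=0}^{n} C(n, j) * (-m)^(n-j) * mu'_j,
    where m is the mean (the first raw moment)."""
    m = raw_moment(pmf, 1)
    return sum(comb(n, j) * (-m) ** (n - j) * raw_moment(pmf, j)
               for j in range(n + 1))

die = {x: 1 / 6 for x in range(1, 7)}  # illustrative distribution
print(central_from_raw(die, 2))  # variance of a fair die, 35/12
print(central_from_raw(die, 3))  # 0: the distribution is symmetric
```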
The first moment about the mean is zero. The second moment about the mean is called the variance, and is usually denoted σ^2, where σ represents the standard deviation. The third and fourth moments about the mean are used to define the standardized moments, which are in turn used to define skewness and kurtosis, respectively.
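The k-th standardized moment is the k-th central moment divided by σ^k, so skewness and kurtosis fall out for k = 3 and k = 4. The `standardized_moment` helper and the fair-die example are illustrative assumptions:

```python
def central_moment(pmf, k):
    """k-th central moment of a discrete {value: probability} distribution."""
    mean = sum(x * p for x, p in pmf.items())
    return sum((x - mean) ** k * p for x, p in pmf.items())

def standardized_moment(pmf, k):
    """k-th standardized moment: mu_k / sigma**k."""
    sigma = central_moment(pmf, 2) ** 0.5
    return central_moment(pmf, k) / sigma ** k

die = {x: 1 / 6 for x in range(1, 7)}  # illustrative distribution
skewness = standardized_moment(die, 3)  # 0 for this symmetric distribution
kurtosis = standardized_moment(die, 4)
print(skewness, kurtosis)
```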
For n ≥ 2, the nth central moment is translation-invariant, i.e. for any random variable X and any constant c, we have

μ_n(X + c) = μ_n(X).
For all n, the nth central moment is homogeneous of degree n:

μ_n(cX) = c^n μ_n(X).
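Both properties are easy to verify numerically on a discrete distribution; the shift of 10, the scale factor of 3, and the die pmf below are arbitrary illustrative choices:

```python
def central_moment(pmf, k):
    """k-th central moment of a discrete {value: probability} distribution."""
    mean = sum(x * p for x, p in pmf.items())
    return sum((x - mean) ** k * p for x, p in pmf.items())

die = {x: 1 / 6 for x in range(1, 7)}  # illustrative distribution

# Translation invariance (n >= 2): adding a constant c to X leaves mu_n unchanged.
shifted = {x + 10: p for x, p in die.items()}
assert abs(central_moment(shifted, 2) - central_moment(die, 2)) < 1e-9
assert abs(central_moment(shifted, 3) - central_moment(die, 3)) < 1e-9

# Homogeneity of degree n: mu_n(c * X) = c**n * mu_n(X).
scaled = {3 * x: p for x, p in die.items()}
assert abs(central_moment(scaled, 2) - 3 ** 2 * central_moment(die, 2)) < 1e-9
```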
Only for n ≤ 3 do we have an additivity property for random variables X and Y that are independent:

μ_n(X + Y) = μ_n(X) + μ_n(Y).
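To see this concretely, one can build the exact distribution of X + Y for independent discrete X and Y by convolving their pmfs and compare central moments; the `convolve` helper and the two-dice example are illustrative assumptions. For n = 4 the cross term 6·μ_2(X)·μ_2(Y) appears, so additivity fails:

```python
from collections import defaultdict

def central_moment(pmf, k):
    """k-th central moment of a discrete {value: probability} distribution."""
    mean = sum(x * p for x, p in pmf.items())
    return sum((x - mean) ** k * p for x, p in pmf.items())

def convolve(pmf_a, pmf_b):
    """Exact distribution of X + Y for independent discrete X and Y."""
    out = defaultdict(float)
    for x, p in pmf_a.items():
        for y, q in pmf_b.items():
            out[x + y] += p * q
    return dict(out)

die = {x: 1 / 6 for x in range(1, 7)}  # illustrative distribution
two_dice = convolve(die, die)

# Additive for n = 2 and n = 3:
print(central_moment(two_dice, 2), 2 * central_moment(die, 2))  # equal
# Not additive for n = 4; the discrepancy is 6 * mu_2(X) * mu_2(Y):
print(central_moment(two_dice, 4) - 2 * central_moment(die, 4))
```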
A related functional that shares the translation-invariance and homogeneity properties with the nth central moment, but continues to have this additivity property even when n ≥ 4, is the nth cumulant κ_n(X). For n = 1, the nth cumulant is just the expected value; for n = 2 or 3, the nth cumulant is just the nth central moment; for n ≥ 4, the nth cumulant is an nth-degree monic polynomial in the first n moments (about zero), and is also a (simpler) nth-degree polynomial in the first n central moments.
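For n = 4 that polynomial in the central moments is κ_4 = μ_4 − 3·μ_2^2, which restores additivity under independence. The sketch below checks this on the same illustrative two-dice construction (the pmf helpers and the example are assumptions, not from the original text):

```python
from collections import defaultdict

def central_moment(pmf, k):
    """k-th central moment of a discrete {value: probability} distribution."""
    mean = sum(x * p for x, p in pmf.items())
    return sum((x - mean) ** k * p for x, p in pmf.items())

def convolve(pmf_a, pmf_b):
    """Exact distribution of X + Y for independent discrete X and Y."""
    out = defaultdict(float)
    for x, p in pmf_a.items():
        for y, q in pmf_b.items():
            out[x + y] += p * q
    return dict(out)

def fourth_cumulant(pmf):
    """kappa_4 = mu_4 - 3 * mu_2**2, a polynomial in the central moments."""
    return central_moment(pmf, 4) - 3 * central_moment(pmf, 2) ** 2

die = {x: 1 / 6 for x in range(1, 7)}  # illustrative distribution
two_dice = convolve(die, die)

# Unlike mu_4, kappa_4 is additive for independent variables:
print(fourth_cumulant(two_dice), 2 * fourth_cumulant(die))  # equal
```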
This page uses Creative Commons Licensed content from Wikipedia (view authors).