The motivation for studying empirical processes is that it is often impossible to know the true underlying probability measure $P$. We collect observations $X_1, X_2, \ldots, X_n$ and compute relative frequencies. We can estimate $P$, or a related distribution function $F$, by means of the empirical measure or the empirical distribution function, respectively. Theorems in the area of empirical processes confirm that these are uniformly good estimates or determine the accuracy of the estimation.
Suppose $S$ is a sample space of observations. $S$ can be quite general; for example: the real line, some Euclidean space, a space of functions, a Riemannian manifold, or whatever might be of interest. Let $X_1, X_2, \ldots, X_n$ be independent identically distributed (iid) random variables (rv's) with probability measure $P$ on $S$. For a measurable set $C \subseteq S$, the empirical measure is defined as

$$P_n(C) = \frac{1}{n} \sum_{i=1}^{n} I_C(X_i),$$

where $I_C$ is the indicator function of the set $C$.
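As a concrete illustration, here is a minimal Python sketch of the empirical measure; the sample and the set $C = [0, 1]$ are hypothetical choices made only for illustration, not taken from the text:

```python
# Empirical measure P_n(C): the fraction of the n observations that fall in C.
def empirical_measure(sample, indicator):
    """P_n(C) = (1/n) * sum_i 1{X_i in C}, with C given by its indicator function."""
    return sum(1 for x in sample if indicator(x)) / len(sample)

# Hypothetical sample and the set C = [0, 1].
sample = [0.2, 1.5, -0.3, 0.8, 0.9, 2.1]
p_n = empirical_measure(sample, lambda x: 0.0 <= x <= 1.0)
print(p_n)  # 3 of the 6 observations lie in [0, 1], so P_n(C) = 0.5
```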
If $\mathcal{C}$ is a collection of measurable subsets of $S$, then the collection

$$\{ P_n(C) : C \in \mathcal{C} \}$$

is the empirical measure indexed by $\mathcal{C}$. The empirical process is defined as

$$G_n(C) = \sqrt{n} \, (P_n(C) - P(C)),$$

and $\{ G_n(C) : C \in \mathcal{C} \}$ is the empirical process indexed by $\mathcal{C}$.
A special case is the empirical process associated with empirical distribution functions:

$$F_n(x) = P_n((-\infty, x]) = \frac{1}{n} \sum_{i=1}^{n} I_{(-\infty, x]}(X_i),$$

where $X_1, \ldots, X_n$ are real-valued random variables with distribution function $F$, and $I_{(-\infty, x]}$ is defined by $I_{(-\infty, x]}(X_i) = 1$ if $X_i \le x$ and $0$ otherwise. In this case, the empirical process is

$$G_n(x) = \sqrt{n} \, (F_n(x) - F(x)).$$
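The empirical distribution function and the associated empirical process can be sketched in a few lines of Python; the sample and the choice of the Uniform(0, 1) cdf as the true $F$ are hypothetical, made only to keep the sketch runnable:

```python
import math

def edf(sample, x):
    """Empirical distribution function F_n(x) = P_n((-inf, x])."""
    return sum(1 for xi in sample if xi <= x) / len(sample)

def empirical_process(sample, x, F):
    """G_n(x) = sqrt(n) * (F_n(x) - F(x)) for a known true cdf F."""
    return math.sqrt(len(sample)) * (edf(sample, x) - F(x))

# Hypothetical sample; the cdf of Uniform(0, 1) serves as the true F.
sample = [0.1, 0.4, 0.4, 0.7, 0.9]
uniform_cdf = lambda x: min(max(x, 0.0), 1.0)
print(edf(sample, 0.4))                             # 3/5 = 0.6
print(empirical_process(sample, 0.4, uniform_cdf))  # sqrt(5) * (0.6 - 0.4)
```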
Major results for this special case include the Kolmogorov-Smirnov statistics, the Glivenko-Cantelli theorem, and Donsker's theorem. Moreover, the empirical distribution function of a finite sequence of realizations of a random variable is the very essence of statistical inference.
Glivenko-Cantelli theorem
By the strong law of large numbers, we know that, for every fixed $x$,

$$F_n(x) \to F(x) \quad \text{almost surely as } n \to \infty.$$
However, Glivenko and Cantelli strengthened this result.
The Glivenko-Cantelli theorem (1933):

$$\sup_{x \in \mathbb{R}} \, | F_n(x) - F(x) | \to 0 \quad \text{almost surely as } n \to \infty.$$
Another way to state this is as follows: the sample paths of $F_n$ get uniformly closer to $F$ as $n$ increases; hence $F_n$, which we observe, is almost surely a good approximation of $F$, and the approximation becomes better as we collect more observations.
Donsker's theorem
By the classical central limit theorem, it follows that, for fixed $x$,

$$G_n(x) = \sqrt{n} \, (F_n(x) - F(x)) \to N(0, F(x)(1 - F(x)))$$

in distribution; that is, $G_n(x)$ converges in distribution to a Gaussian (normal) random variable with mean 0 and variance $F(x)(1 - F(x))$. Donsker (1952) showed that the sample paths of $G_n$, as functions on the real line $\mathbb{R}$, converge in distribution to a stochastic process $G$ in the space $\ell^\infty(\mathbb{R})$ of all bounded functions $f : \mathbb{R} \to \mathbb{R}$. The function space $\ell^\infty(\mathbb{R})$ is used in this context to remind us that we are concerned with distributional convergence in terms of sample paths. The limit process $G$ is a Gaussian process with zero mean and covariance given by

$$\operatorname{cov}[G(s), G(t)] = \mathrm{E}[G(s) G(t)] = F(\min(s, t)) - F(s) F(t).$$

The process $G$ can be written as $B \circ F$, where $B$ is a standard Brownian bridge on the unit interval.
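The covariance formula can be checked by simulation. The Python sketch below uses Uniform(0, 1) data, a hypothetical choice under which F(x) = x; the sample covariance of G_n(s) and G_n(t) across Monte Carlo replications should be close to F(min(s, t)) − F(s)F(t) = 0.12 for s = 0.3, t = 0.6:

```python
import math
import random

def simulated_cov(n, m, s, t, seed=0):
    """Monte Carlo estimate of cov[G_n(s), G_n(t)] from m replications
    of G_n, each built on n Uniform(0, 1) observations (so F(x) = x)."""
    rng = random.Random(seed)
    gs, gt = [], []
    for _ in range(m):
        sample = [rng.random() for _ in range(n)]
        fn_s = sum(x <= s for x in sample) / n   # F_n(s)
        fn_t = sum(x <= t for x in sample) / n   # F_n(t)
        gs.append(math.sqrt(n) * (fn_s - s))     # G_n(s), since F(s) = s
        gt.append(math.sqrt(n) * (fn_t - t))     # G_n(t)
    mean_s, mean_t = sum(gs) / m, sum(gt) / m
    return sum((a - mean_s) * (b - mean_t) for a, b in zip(gs, gt)) / m

s, t = 0.3, 0.6
print("predicted:", min(s, t) - s * t)           # F(min(s, t)) - F(s)F(t) = 0.12
print("simulated:", simulated_cov(200, 2000, s, t))
```

In fact this covariance is exact for every fixed n, since cov[F_n(s), F_n(t)] = (F(min(s, t)) − F(s)F(t))/n; what Donsker's theorem adds is that the whole path of G_n converges in distribution to the Gaussian process G.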
If the observations are in a more general sample space , we seek generalizations of the Glivenko-Cantelli theorem and Donsker's theorem. Also, we seek other theorems to determine rates of convergence and accuracy of estimation.
The classical empirical distribution function for real-valued random variables is a special case of the general theory, with $S = \mathbb{R}$ and the class of sets $\mathcal{C} = \{ (-\infty, x] : x \in \mathbb{R} \}$.
- P. Billingsley, Probability and Measure, John Wiley and Sons, New York, third edition, 1995.
- M.D. Donsker, Justification and extension of Doob's heuristic approach to the Kolmogorov-Smirnov theorems, Annals of Mathematical Statistics, 23: 277–281, 1952.
- R.M. Dudley, Central limit theorems for empirical measures, Annals of Probability, 6(6): 899–929, 1978.
- R.M. Dudley, Uniform Central Limit Theorems, Cambridge Studies in Advanced Mathematics, 63, Cambridge University Press, Cambridge, UK, 1999.
- J. Wolfowitz, Generalization of the theorem of Glivenko-Cantelli, Annals of Mathematical Statistics, 25: 131–138, 1954.
- Empirical Processes: Theory and Applications, by David Pollard, a textbook available online.
- Introduction to Empirical Processes and Semiparametric Inference, by Michael Kosorok, another textbook available online.
This page uses Creative Commons Licensed content from Wikipedia.