Probability distribution


A probability distribution describes the values a random variable can take and the probability of each. The values must cover all possible outcomes of the event, and the probabilities must sum to exactly 1, or 100%. For example, a single coin flip can take the values Heads or Tails, each with probability exactly 1/2; these two values and their two probabilities make up the probability distribution of the single coin-flip event. This distribution is called a discrete distribution because there is a countable number of discrete outcomes with positive probability.
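The coin-flip example can be written down directly as a minimal sketch in Python (the dictionary representation is illustrative, not from the original article): a discrete distribution is just a mapping from outcomes to probabilities whose values sum to 1.

```python
from fractions import Fraction

# Probability distribution of a single fair coin flip:
# each possible outcome maps to its probability.
coin = {"Heads": Fraction(1, 2), "Tails": Fraction(1, 2)}

# The defining property: the probabilities sum to exactly 1.
assert sum(coin.values()) == 1
```

Using exact fractions rather than floats makes the "sums to exactly 1" requirement hold without rounding error.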

A continuous distribution describes events over a continuous range, where the probability of any specific outcome is zero. For example, a dart thrown at a dartboard has essentially zero probability of landing at a specific point, since a point is vanishingly small, but it has some probability of landing within a given area. The probability of landing within the small area of the bullseye would (hopefully) be greater than the probability of landing on an equivalent area elsewhere on the board. A smooth function that describes the probability of landing anywhere on the dartboard is the probability distribution of the dart-throwing event. The integral of the probability density function (pdf) over the entire area of the dartboard (and, perhaps, the wall surrounding it) must be equal to 1, since each dart must land somewhere.
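The "integrates to 1" property can be checked numerically; a one-dimensional sketch (illustrative, using the standard normal density in place of the dartboard) approximates the integral with a trapezoidal sum:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal distribution; a 1-D analogue of the dartboard."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# The probability of any single point is zero, but the density integrated
# over the whole line must equal 1. Approximate with a trapezoidal sum
# over [-8, 8], which contains essentially all of the mass.
xs = [-8 + 16 * i / 10000 for i in range(10001)]
step = xs[1] - xs[0]
total = sum((normal_pdf(a) + normal_pdf(b)) / 2 * step for a, b in zip(xs, xs[1:]))

assert abs(total - 1.0) < 1e-6
```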

The concept of the probability distribution, and of the random variables it describes, underlies the mathematical discipline of probability theory and the science of statistics. There is spread or variability in almost any value that can be measured in a population (e.g. height of people, durability of a metal, etc.); almost all measurements are made with some intrinsic error; and in physics many processes are described probabilistically, from the kinetic properties of gases to the quantum mechanical description of fundamental particles. For these and many other reasons, simple numbers are often inadequate for describing a quantity, while probability distributions are often more appropriate models. There are, however, considerable mathematical complications in manipulating probability distributions, since most standard arithmetic and algebraic manipulations cannot be applied.

Rigorous definitions

In probability theory, every random variable may be regarded as a function from a sample space to a state space, where the state space is equipped with a probability distribution that assigns a probability to every subset (more precisely, every measurable subset) of the state space in such a way that the probability axioms are satisfied. That is, probability distributions are probability measures defined over a state space instead of the sample space. A random variable then defines a probability measure on the state space by assigning to each subset of the state space the probability of its inverse image in the sample space. In other words, the probability distribution of a random variable is the pushforward measure of the probability measure on the sample space.

Probability distributions of real-valued random variables

Because a probability distribution Pr on the real line is determined by the probabilities it assigns to half-open intervals, Pr(a, b], the probability distribution of a real-valued random variable X is completely characterized by its cumulative distribution function:

 F(x) = \Pr \left[ X \le x \right] \qquad \forall x \in \mathbb{R}.

Discrete probability distribution

Main article: Discrete probability distribution

A probability distribution is called discrete if its cumulative distribution function only increases in jumps.

The set of all values that a discrete random variable can assume with non-zero probability is either finite or countably infinite, because the sum of uncountably many positive real numbers (defined as the least upper bound of the set of all finite partial sums) always diverges to infinity. Typically, the set of possible values is topologically discrete, in the sense that all its points are isolated points. However, there are discrete random variables for which this countable set is dense on the real line.

Discrete distributions are characterized by a probability mass function p such that

F(x) = \Pr \left[X \le x \right] = \sum_{x_i \le x} p(x_i).
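The cumulative distribution function of a discrete distribution is just a running sum of the probability mass function; a minimal sketch using a fair six-sided die (the die is an illustrative choice, not from the article):

```python
from itertools import accumulate

# PMF of a fair six-sided die: each value 1..6 has probability 1/6.
values = [1, 2, 3, 4, 5, 6]
pmf = [1 / 6] * 6

# F(x) = sum of p(x_i) over all x_i <= x: a cumulative sum of the PMF.
cdf = list(accumulate(pmf))

assert abs(cdf[-1] - 1.0) < 1e-12   # F reaches 1 at the largest value
assert abs(cdf[2] - 0.5) < 1e-9     # Pr[X <= 3] = 3/6
```

The CDF of a discrete variable increases only in jumps, one at each value carrying positive probability, which is exactly the characterization given above.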

Continuous probability distribution

Main article: Continuous probability distribution

By one convention, a probability distribution is called continuous if its cumulative distribution function is continuous. This means it corresponds to a random variable X for which Pr[ X = x ] = 0 for every x in R.

Another convention reserves the term continuous probability distribution for absolutely continuous distributions. These distributions can be characterized by a probability density function: a non-negative Lebesgue integrable function f defined on the real numbers such that

F(x) = \Pr \left[ X \le x \right] = \int_{-\infty}^x f(t)\,dt
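For an absolutely continuous distribution, the CDF really is the integral of the density; a sketch that checks this numerically for the standard normal, comparing a trapezoidal integral of the density against the closed form via the error function (the function names here are illustrative):

```python
import math

def normal_pdf(t):
    """Standard normal density."""
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def normal_cdf_numeric(x, lo=-8.0, n=4000):
    """F(x) = integral of the density from -infinity to x, approximated
    by a trapezoidal sum starting from lo (far into the negligible tail)."""
    h = (x - lo) / n
    ts = [lo + i * h for i in range(n + 1)]
    return sum((normal_pdf(a) + normal_pdf(b)) / 2 * h for a, b in zip(ts, ts[1:]))

# Closed form for the standard normal CDF uses the error function.
exact = 0.5 * (1 + math.erf(0.0 / math.sqrt(2)))  # F(0) = 1/2 by symmetry
assert abs(normal_cdf_numeric(0.0) - exact) < 1e-6
```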

Discrete distributions and some continuous distributions (like the devil's staircase) do not admit such a density.


The support of a distribution is the smallest closed set whose complement has probability zero.

The probability distribution of the sum of two independent random variables is the convolution of their individual distributions.

The probability distribution of the difference of two independent random variables is the cross-correlation of their distributions.

A discrete random variable is a random variable whose probability distribution is discrete. Similarly, a continuous random variable is a random variable whose probability distribution is continuous.

List of important probability distributions

Certain random variables occur very often in probability theory, in some cases due to their application to many natural and physical processes, and in some cases due to theoretical reasons such as the central limit theorem, the Poisson limit theorem, or properties such as memorylessness or other characterizations. Their distributions therefore have gained special importance in probability theory.
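The central limit theorem mentioned above can be glimpsed in a short simulation; a sketch (parameters chosen for illustration) showing that sums of many independent uniform draws concentrate around a normal shape with predictable mean and standard deviation:

```python
import random
import statistics

random.seed(0)

# Sums of 48 independent uniform(0, 1) draws: by the central limit theorem
# they are approximately normal, with mean 48 * 1/2 = 24 and
# variance 48 * 1/12 = 4 (standard deviation 2).
sums = [sum(random.random() for _ in range(48)) for _ in range(20000)]

mean = statistics.fmean(sums)
stdev = statistics.stdev(sums)

assert abs(mean - 24) < 0.1
assert abs(stdev - 2.0) < 0.1
```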

Discrete distributions

With finite support

  • The Bernoulli distribution, which takes value 1 with probability p and value 0 with probability q = 1 − p.
  • The Rademacher distribution, which takes value 1 with probability 1/2 and value −1 with probability 1/2.
  • The binomial distribution describes the number of successes in a series of independent Yes/No experiments.
  • The degenerate distribution at x0, where X is certain to take the value x0. This does not look random, but it satisfies the definition of random variable. It is useful because it puts deterministic variables and random variables in the same formalism.
  • The discrete uniform distribution, where all elements of a finite set are equally likely. This is supposed to be the distribution of a balanced coin, an unbiased die, a casino roulette or a well-shuffled deck. Also, one can use measurements of quantum states to generate uniform random variables. All these are "physical" or "mechanical" devices, subject to design flaws or perturbations, so the uniform distribution is only an approximation of their behaviour. In digital computers, pseudo-random number generators are used to produce a statistically random discrete uniform distribution.
  • The hypergeometric distribution, which describes the number of successes in the first m of a series of n Yes/No experiments, if the total number of successes is known.
  • Zipf's law or the Zipf distribution. A discrete power-law distribution, the most famous example of which is the description of the frequency of words in the English language.
  • The Zipf-Mandelbrot law is a discrete power law distribution which is a generalization of the Zipf distribution.
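The relationship between the first two entries above can be made concrete: the binomial PMF has a simple closed form, and with n = 1 it reduces to the Bernoulli distribution. A minimal sketch (the function name is illustrative):

```python
import math

def binomial_pmf(k, n, p):
    """Probability of exactly k successes in n independent Bernoulli(p) trials."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

# With a single trial, the binomial reduces to the Bernoulli distribution.
assert binomial_pmf(1, 1, 0.3) == 0.3

# The PMF sums to 1 over its finite support {0, 1, ..., n}.
assert abs(sum(binomial_pmf(k, 10, 0.5) for k in range(11)) - 1.0) < 1e-12
```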

With infinite support

  • The Poisson distribution, which describes the number of events occurring in a fixed interval of time, given that they occur independently at a known average rate.

Continuous distributions

Supported on a bounded interval

  • The Beta distribution on [0,1], of which the uniform distribution is a special case, and which is useful in estimating success probabilities.
  • The continuous uniform distribution on [a,b], where all points in a finite interval are equally likely.

Supported on semi-infinite intervals, usually [0,∞)

  • The chi-square distribution, the distribution of a sum of squares of independent standard normal variables, which is important in hypothesis testing.
  • The exponential distribution, which describes the time between consecutive rare random events occurring at a constant average rate.
  • The Gamma distribution, which generalizes the exponential distribution and describes waiting times.
  • The Pareto distribution, a continuous power-law distribution.

Supported on the whole real line

  • The Cauchy distribution, a heavy-tailed distribution with no defined mean or variance.
  • The Laplace distribution, a symmetric distribution formed from two exponential tails placed back to back.
  • The normal distribution, also called the Gaussian or the bell curve, which is ubiquitous in nature and statistics due to the central limit theorem.

Joint distributions

For any set of independent random variables the probability density function of their joint distribution is the product of their individual density functions.
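The product rule for independent variables is easy to check for small discrete examples; a sketch building the joint PMF of an independent coin and die (chosen here for illustration) and recovering a marginal by summing out the other variable:

```python
# Joint PMF of two independent discrete random variables: the product rule.
coin = {"H": 0.5, "T": 0.5}
die = {k: 1 / 6 for k in range(1, 7)}

joint = {(c, d): pc * pd for c, pc in coin.items() for d, pd in die.items()}

# The joint distribution is itself a probability distribution.
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Marginalizing (summing over the die) recovers the coin's distribution.
marginal_coin = sum(joint[("H", d)] for d in die)
assert abs(marginal_coin - coin["H"]) < 1e-12
```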

Two or more random variables on the same sample space

Matrix-valued distributions

Miscellaneous distributions


This page uses Creative Commons Licensed content from Wikipedia (view authors).
