# Conditional probability

This article defines some terms which characterize probability distributions of two or more variables.

**Conditional probability** is the probability of some event *A*, given the occurrence of some other event *B*.
Conditional probability is written *P*(*A*|*B*), and is read "the probability of *A*, given *B*".

**Joint probability** is the probability of two events occurring together, i.e. the probability of both events in conjunction. The joint probability of *A* and *B* is written *P*(*A* ∩ *B*) or *P*(*A*, *B*).

**Marginal probability** is the probability of one event, regardless of the other event. Marginal probability is obtained by summing (or integrating, more generally) the joint probability over the unrequired event. This is called **marginalization**. The marginal probability of *A* is written *P*(*A*), and the marginal probability of *B* is written *P*(*B*).
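As a sketch of marginalization, the joint probabilities below (invented for illustration, not from the article) are summed over the unrequired event to recover each marginal:

```python
# Hypothetical joint distribution over two binary events A and B,
# stored as {(a, b): probability}; the numbers are invented and sum to 1.
joint = {
    (True, True): 0.12, (True, False): 0.28,
    (False, True): 0.18, (False, False): 0.42,
}

# Marginalize out B to get P(A): sum the joint over the unrequired event.
p_a = sum(p for (a, b), p in joint.items() if a)

# Marginalize out A to get P(B).
p_b = sum(p for (a, b), p in joint.items() if b)

# p_a is 0.12 + 0.28 = 0.40; p_b is 0.12 + 0.18 = 0.30
```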

In these definitions, note that there need not be a causal or temporal relation between *A* and *B*. *A* may precede *B*, or vice versa, or they may happen at the same time. *A* may cause *B*, or vice versa, or they may have no causal relation at all.

**Conditioning** of probabilities, i.e. updating them to take account of (possibly new) information, may be achieved through Bayes' theorem.

## Definition

Given events (or subsets) *A* and *B* of the sample space Ω (also termed the universe by some textbooks), if it is known that an element randomly drawn from Ω belongs to *B*, then the probability that it also belongs to *A* is *defined* to be the conditional probability of *A*, given *B*. For a finite sample space of equally likely outcomes, this definition gives the counting formula

*P*(*A*|*B*) = |*A* ∩ *B*| / |*B*|

Now, divide the numerator and denominator by |Ω| to obtain

*P*(*A*|*B*) = *P*(*A* ∩ *B*) / *P*(*B*)

Equivalently, we have

*P*(*A* ∩ *B*) = *P*(*A*|*B*) *P*(*B*)
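The counting definition can be sketched on a small, invented sample space (a single roll of a fair die); exact fractions avoid floating-point noise:

```python
from fractions import Fraction

# Hypothetical finite sample space: one roll of a fair six-sided die.
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # event "the roll is even"
B = {4, 5, 6}   # event "the roll is greater than 3"

# Counting definition: P(A|B) = |A intersect B| / |B|
p_a_given_b = Fraction(len(A & B), len(B))

# Dividing numerator and denominator by |omega| gives the same value
# as P(A and B) / P(B).
p_joint = Fraction(len(A & B), len(omega))
p_b = Fraction(len(B), len(omega))

# Both routes agree: P(A|B) = 2/3
assert p_a_given_b == p_joint / p_b
```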

## Statistical independence

Two random events *A* and *B* are statistically independent if and only if

*P*(*A* ∩ *B*) = *P*(*A*) *P*(*B*)

Thus, if *A* and *B* are independent, then their joint probability can be expressed as a simple product of their individual probabilities.

Equivalently, for two independent events *A* and *B* (with *P*(*A*) > 0 and *P*(*B*) > 0),

*P*(*A*|*B*) = *P*(*A*)

and

*P*(*B*|*A*) = *P*(*B*)

In other words, if *A* and *B* are independent, then the conditional probability of *A*, given *B* is simply the individual probability of *A* alone; likewise, the probability of *B* given *A* is simply the probability of *B* alone.
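Independence can be checked directly on a small example; the two-coin sample space below is invented for illustration:

```python
from fractions import Fraction
from itertools import product

# Hypothetical sample space: two fair coin flips, all outcomes equally likely.
omega = list(product("HT", repeat=2))

A = [w for w in omega if w[0] == "H"]   # event: first flip is heads
B = [w for w in omega if w[1] == "H"]   # event: second flip is heads
AB = [w for w in A if w in B]           # both events occur

p_a = Fraction(len(A), len(omega))
p_b = Fraction(len(B), len(omega))
p_ab = Fraction(len(AB), len(omega))

# Independence: the joint probability factors into the product ...
assert p_ab == p_a * p_b
# ... and equivalently, conditioning on B does not change P(A).
assert p_ab / p_b == p_a
```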

## Mutual exclusivity

Two events *A* and *B* are mutually exclusive if and only if

*P*(*A* ∩ *B*) = 0

As long as *P*(*A*) ≠ 0 and *P*(*B*) ≠ 0, it then follows that

*P*(*A*|*B*) = 0

and

*P*(*B*|*A*) = 0

In other words, the probability of *A* happening, given that *B* happens, is nil since *A* and *B* cannot both happen in the same situation; likewise, the probability of *B* happening, given that *A* happens, is also nil.

## Other considerations

- If *B* is an event with *P*(*B*) > 0, then the function *Q* defined by *Q*(*A*) = *P*(*A*|*B*) for all events *A* is a probability measure.

- If *P*(*B*) = 0, then *P*(*A*|*B*) is left undefined.

- Conditional probability can be calculated with a decision tree.
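The decision-tree calculation mentioned above can be sketched as follows: multiply probabilities along each branch to get joint path probabilities, then divide by a marginal to condition. The two-urn setup and all branch probabilities below are invented for illustration:

```python
# Hypothetical two-stage tree: first pick an urn, then draw a ball.
p_urn = {"urn1": 0.5, "urn2": 0.5}            # first-stage branches
p_red_given_urn = {"urn1": 0.3, "urn2": 0.8}  # second-stage branches

# Joint probability of each path = product of its branch probabilities.
p_path = {u: p_urn[u] * p_red_given_urn[u] for u in p_urn}

# Marginal probability of drawing red: sum over all paths ending in red.
p_red = sum(p_path.values())

# Conditioning the other way (Bayes' theorem):
# P(urn1 | red) = P(urn1 and red) / P(red)
p_urn1_given_red = p_path["urn1"] / p_red
```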

## The conditional probability fallacy

The conditional probability fallacy is the assumption that *P*(*A*|*B*) is approximately equal to, or determined by, *P*(*B*|*A*). The mathematician John Allen Paulos discusses this in his book *Innumeracy*, where he points out that it is a mistake often made even by doctors, lawyers, and other highly educated non-statisticians. It can be overcome by describing the data in actual numbers rather than probabilities.
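The remedy of working in actual numbers can be sketched with a hypothetical screening test; every figure below is invented for illustration:

```python
# Hypothetical screening test applied to 100,000 people; all figures invented.
population = 100_000
prevalence = 0.001            # 1 in 1,000 actually has the condition
sensitivity = 0.99            # P(positive | sick)
false_positive_rate = 0.05    # P(positive | healthy)

sick = population * prevalence                    # 100 people
healthy = population - sick                       # 99,900 people

true_positives = sick * sensitivity               # 99 people
false_positives = healthy * false_positive_rate   # 4,995 people

# P(sick | positive) is far from P(positive | sick) = 0.99:
p_sick_given_positive = true_positives / (true_positives + false_positives)
# roughly 0.019 -- under 2%, despite the "99% accurate" test
```

Counting people instead of quoting probabilities makes it obvious why the two conditional probabilities differ: the false positives vastly outnumber the true positives.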

## See also

- Bayes' theorem
- Likelihood function
- Posterior probability
- Probability theory
- Monty Hall problem
- Prosecutor's fallacy
- Conditional expectation


This page uses Creative Commons Licensed content from Wikipedia (view authors).