# Markov process




A **Markov process**, named after the Russian mathematician Andrey Markov, is a mathematical model for the random evolution of a memoryless system. The 'memoryless' property is often expressed as follows: *conditional on the present state of the system, its future and past are independent*.

Mathematically, the Markov process satisfies, for any *n* and any states $x_0, x_1, \ldots, x_{n+1}$,

$$\Pr\big(X_{n+1}=x_{n+1} \mid X_n=x_n, \ldots, X_0=x_0\big) = \Pr\big(X_{n+1}=x_{n+1} \mid X_n=x_n\big).$$

Often, the term Markov chain is used to mean a discrete-time Markov process. Also see continuous-time Markov process.
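A discrete-time Markov chain on a finite state space is fully specified by its transition probabilities. The sketch below simulates such a chain; the two-state "weather" model and its probabilities are illustrative, not taken from the article.

```python
import random

# Hypothetical two-state weather chain: transition probabilities out of
# each state (each row sums to 1). Any such table defines a
# discrete-time, time-homogeneous Markov chain.
STATES = ["sunny", "rainy"]
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Draw the next state using only the current state (memorylessness)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Return a sample path of length n + 1 starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Because each step consults only the current state, the sampled path automatically satisfies the Markov property above.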

Mathematically, if *X*(*t*), *t* > 0, is a stochastic process, the Markov property states that

$$\Pr\big(X(t+h)=y \mid X(s)=x(s),\ \forall s \le t\big) = \Pr\big(X(t+h)=y \mid X(t)=x(t)\big), \quad \forall h > 0.$$

Markov processes are typically termed *(time-) homogeneous* if

$$\Pr\big(X(t+h)=y \mid X(t)=x\big) = \Pr\big(X(h)=y \mid X(0)=x\big), \quad \forall t, h > 0,$$

and otherwise are termed *(time-) inhomogeneous* (or *(time-) nonhomogeneous*). Homogeneous Markov processes, usually being simpler than inhomogeneous ones, form the most important class of Markov processes.

In some cases, apparently non-Markovian processes may still have Markovian representations, constructed by expanding the concept of the 'current' and 'future' states. For example, let *X* be a non-Markovian process. Then define a process *Y*, such that each state of *Y* represents a time-interval of states of *X*, i.e. mathematically,

$$Y(t) = \big\{ X(s) : s \in [a(t), b(t)] \big\}.$$

If *Y* has the Markov property, then it is a Markovian representation of *X*. In this case, *X* is also called a **second-order Markov process**. **Higher-order Markov processes** are defined analogously.
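The pair-state construction can be sketched for a discrete chain: under a hypothetical second-order rule on {0, 1}, where the next symbol depends on the last *two* symbols, the process becomes an ordinary first-order Markov chain on pair states. All probabilities below are made up for illustration.

```python
import random

# Hypothetical second-order rule: P(next = 1 | last two symbols).
# In the augmented state Y_n = (x_{n-1}, x_n), this is an ordinary
# first-order Markov chain on four pair states.
P_ONE = {
    (0, 0): 0.1,
    (0, 1): 0.7,
    (1, 0): 0.4,
    (1, 1): 0.9,
}

def step_pair(pair, rng):
    """First-order transition on the pair state Y_n = (x_{n-1}, x_n)."""
    nxt = 1 if rng.random() < P_ONE[pair] else 0
    return (pair[1], nxt)

def simulate_pairs(n, seed=0):
    """Sample n further symbols after the initial history (0, 0)."""
    rng = random.Random(seed)
    pair = (0, 0)
    out = [0, 0]
    for _ in range(n):
        pair = step_pair(pair, rng)
        out.append(pair[1])
    return out

print(simulate_pairs(10))
```

The symbol sequence itself is second-order Markov, while the pair sequence is first-order Markov, mirroring the construction of *Y* from *X* above.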

An example of a non-Markovian process with a Markovian representation is a moving average time series.
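This can be sketched concretely for an MA(1) series (with an assumed coefficient θ = 0.6 and Gaussian innovations, both chosen only for illustration): X_n = ε_n + θ·ε_{n−1} is not Markov in *X* alone, because the next value depends on the unobserved innovation ε_n, not just on X_n. Augmenting the state to Y_n = (X_n, ε_n) restores the Markov property.

```python
import random

# Assumed MA(1) coefficient, for illustration only.
THETA = 0.6

def ma1_step(y, rng):
    """One Markovian transition of the augmented state Y = (X, eps):
    Y_{n+1} depends only on Y_n and fresh noise."""
    _, eps_prev = y
    eps = rng.gauss(0.0, 1.0)      # fresh innovation
    x = eps + THETA * eps_prev     # MA(1) recursion
    return (x, eps)

def simulate_ma1(n, seed=0):
    """Return n observations X_1, ..., X_n of the MA(1) series."""
    rng = random.Random(seed)
    y = (0.0, 0.0)
    xs = []
    for _ in range(n):
        y = ma1_step(y, rng)
        xs.append(y[0])
    return xs

print(simulate_ma1(5))
```

Observing only the X-component of Y gives the non-Markovian moving-average series, while Y itself is its Markovian representation.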

## References

- Eric W. Weisstein, *Markov process* at MathWorld.

## See also

- Examples of Markov chains
- Memorylessness
- Semi-Markov process
- Andrey Markov
- Markov chain
- Markov decision process
- Dynamics of Markovian particles

This page uses Creative Commons Licensed content from Wikipedia.