Markov process




A Markov process, named after the Russian mathematician Andrey Markov, is a mathematical model for the random evolution of a memoryless system. The 'memoryless' property is usually expressed by saying that, conditional on the present state of the system, the future and the past are independent.

Mathematically, the Markov property is expressed as follows: for any n and any times t_1<t_2<\cdots<t_n,

P[x(t_n) \le x_n \mid x(t)~\forall~t \le t_{n-1}] = P[x(t_n) \le x_n \mid x(t_{n-1})].

Often, the term Markov chain is used to mean a discrete-time Markov process. Also see continuous-time Markov process.
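A discrete-time Markov chain can be simulated directly from this definition: the next state is drawn from a distribution that depends only on the current state. The sketch below uses a hypothetical two-state chain (the states and transition probabilities are illustrative, not from the text).

```python
import random

# Hypothetical two-state chain: from state 0 move to 1 with
# probability 0.3; from state 1 move to 0 with probability 0.4.
P = {0: {0: 0.7, 1: 0.3},
     1: {0: 0.4, 1: 0.6}}

def simulate(start, steps, rng):
    """Simulate a discrete-time Markov chain: the next state depends
    only on the current state, never on earlier history."""
    path = [start]
    for _ in range(steps):
        current = path[-1]
        r = rng.random()
        cumulative = 0.0
        # Sample the next state from the row of P for the current state.
        for state, prob in P[current].items():
            cumulative += prob
            if r < cumulative:
                path.append(state)
                break
    return path

path = simulate(0, 100, random.Random(42))
```

Note that `simulate` never inspects anything but `path[-1]` when choosing the next state — that restriction is exactly the Markov property.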

Mathematically, if X(t), t > 0, is a stochastic process, the Markov property states that

\mathrm{Pr}\big[X(t+h) = y \,|\, X(s) = x(s), \forall s \leq t\big] = \mathrm{Pr}\big[X(t+h) = y \,|\, X(t) = x(t)\big], \quad \forall h > 0.

Markov processes are typically termed (time-) homogeneous if

\mathrm{Pr}\big[X(t+h) = y \,|\, X(t) = x\big] = \mathrm{Pr}\big[X(h) = y \,|\, X(0) = x\big], \quad \forall t, h > 0,

and otherwise are termed (time-) inhomogeneous (or (time-) nonhomogeneous). Homogeneous Markov processes, usually being simpler than inhomogeneous ones, form the most important class of Markov processes.
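For a time-homogeneous chain the one-step transition matrix P is the same at every step, so the n-step transition probabilities are simply the matrix power P^n. The sketch below (pure-Python matrix arithmetic, same hypothetical two-state chain as above) shows the rows of P^n converging to the stationary distribution, which for this chain is (4/7, 3/7).

```python
# Hypothetical time-homogeneous two-state chain.
P = [[0.7, 0.3],
     [0.4, 0.6]]

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matpow(M, n):
    """n-th power of a 2x2 matrix by repeated multiplication."""
    result = [[1.0, 0.0], [0.0, 1.0]]  # 2x2 identity
    for _ in range(n):
        result = matmul(result, M)
    return result

# P^50: each row is (very nearly) the stationary distribution (4/7, 3/7).
Pn = matpow(P, 50)
```

Time-inhomogeneous chains have no single matrix P; the product would instead run over a different matrix P(t) at each step.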

In some cases, apparently non-Markovian processes may still have Markovian representations, constructed by expanding the notion of the 'current' and 'future' states. For example, let X be a non-Markovian process. Then define a process Y such that each state of Y represents a time interval of states of X; that is,

Y(t) = \big\{ X(s): s \in [a(t), b(t)] \, \big\}.

If Y has the Markov property, then it is a Markovian representation of X. In this case, if each Y-state comprises two consecutive X-states, X is also called a second-order Markov process; higher-order Markov processes are defined analogously.

An example of a non-Markovian process with a Markovian representation is a moving average time series.
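The state-expansion trick can be illustrated concretely. In the sketch below, X is a hypothetical second-order process (each new value depends on the two most recent values, so X alone is not Markov), and the expanded process Y(t) = (X(t-1), X(t)) packs consecutive pairs into single states, so the next Y-state depends only on the current one.

```python
import random

def step_x(prev2, prev1, rng):
    """Hypothetical second-order rule: flip the value with probability
    0.9 if the last two values differed, else with probability 0.1."""
    p_flip = 0.9 if prev2 != prev1 else 0.1
    return 1 - prev1 if rng.random() < p_flip else prev1

def simulate_y(steps, rng):
    x = [0, 0]
    for _ in range(steps):
        x.append(step_x(x[-2], x[-1], rng))
    # The Markovian representation: each Y-state is a pair of
    # consecutive X-states, so Y carries all the history X needs.
    return list(zip(x, x[1:]))

y = simulate_y(10, random.Random(0))
```

Consecutive Y-states overlap in one X-value (the second component of y[i] equals the first component of y[i+1]), which is what lets Y reproduce the two-step dependence of X while itself being first-order Markov.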


This page uses Creative Commons Licensed content from Wikipedia (view authors).
