Glossary

Markov chain

A Markov chain is a stochastic process that dictates transitions between "states". The chain evolves over a sequence of discrete time steps, with the probability of entering a state determined entirely by the current state. As a result, Markov processes are memoryless, a property that can be defined formally as follows:

$\Pr(X_{n+1}=x \mid X_1=x_1, X_2=x_2, \ldots, X_n=x_n) = \Pr(X_{n+1}=x \mid X_n=x_n).$ In other words, the distribution of the next state $X_{n+1}$ depends only on the current state $X_n$.
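
As a minimal sketch of this property, the Python snippet below simulates a hypothetical two-state weather chain (the state names and probabilities are invented for illustration): each step samples the next state using only the current one, never the earlier history.

```python
import random

# Hypothetical two-state weather chain: the distribution of the next
# state depends only on the current state, never on earlier history.
TRANSITIONS = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state):
    """Sample X_{n+1} given only X_n = state (the Markov property)."""
    states, probs = zip(*TRANSITIONS[state].items())
    return random.choices(states, weights=probs)[0]

# Simulate a short trajectory; each step looks only at the current state.
state = "sunny"
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)
print(path)
```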

A common way to fully describe Markov chains is by using a transition matrix (a.k.a. stochastic matrix):

$ P = \begin{pmatrix} p_{11} & p_{12} & \cdots & p_{1n} \\ p_{21} & p_{22} & \cdots & p_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ p_{n1} & p_{n2} & \cdots & p_{nn} \end{pmatrix} $

where $p_{ij}$ is the probability of transitioning from state $i$ to state $j$. Since the process must move to some state at each step, every row of $P$ sums to 1. A sketch of how the matrix is used follows.
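
The following sketch (again using an invented $2 \times 2$ matrix) shows the standard use of the transition matrix: if $\pi_n$ is a row vector of state probabilities at step $n$, then $\pi_{n+1} = \pi_n P$, so the distribution after $n$ steps is $\pi_0 P^n$.

```python
import numpy as np

# Hypothetical 2x2 transition matrix; entry P[i, j] is the probability
# of moving from state i to state j, so each row must sum to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
assert np.allclose(P.sum(axis=1), 1.0)

# If pi_n is a row vector of state probabilities at step n, then
# pi_{n+1} = pi_n @ P; repeated multiplication gives pi_n = pi_0 @ P^n.
pi = np.array([1.0, 0.0])  # start in state 0 with certainty
for n in range(5):
    print(f"step {n}: {pi}")
    pi = pi @ P
```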

Source: Wikipedia