# Glossary

## Transition matrix

The transition (or stochastic) matrix of a Markov chain describes the chain's dynamics over a finite state space $S$. If the probability of moving from state $i$ to state $j$ in one time step is $p_{ij}$, the transition matrix $P$ has $p_{ij}$ as its entry in row $i$, column $j$. Because the probabilities of leaving any state must sum to 1, each row of $P$ sums to 1:

$P = \begin{pmatrix} p_{11} & p_{12} & \cdots & p_{1n} \\ p_{21} & p_{22} & \cdots & p_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ p_{n1} & p_{n2} & \cdots & p_{nn} \end{pmatrix}$
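
The definition above can be sketched with a small NumPy example; the two-state weather chain and its specific probabilities are illustrative assumptions, not from the source:

```python
import numpy as np

# Hypothetical two-state chain (state 0 = sunny, state 1 = rainy).
P = np.array([
    [0.9, 0.1],   # p_11, p_12: transitions out of "sunny"
    [0.5, 0.5],   # p_21, p_22: transitions out of "rainy"
])

# Each row of a stochastic matrix sums to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# Entry (i, j) of the k-th matrix power gives the k-step
# transition probability from state i to state j.
P2 = np.linalg.matrix_power(P, 2)
print(P2[0, 1])  # sunny -> rainy in two steps: 0.9*0.1 + 0.1*0.5 = 0.14
```

Raising $P$ to the $k$-th power is the standard way to obtain $k$-step transition probabilities, which is why the matrix representation is convenient.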