Markov Chains

A Markov chain is a sequence of possible events in which the probability of each event depends only on the state attained in the previous event; earlier history has no influence.
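That memoryless property can be sketched in a few lines of Python. The states and probabilities here are purely illustrative, not part of the example in this document:

```python
import random

# Illustrative transition table: from each state, a list of
# (next_state, probability) pairs. Probabilities per state sum to 1.
transitions = {
    "A": [("A", 0.6), ("B", 0.4)],
    "B": [("A", 0.3), ("B", 0.7)],
}

def step(state):
    """Pick the next state using only the current state's probabilities."""
    next_states, weights = zip(*transitions[state])
    return random.choices(next_states, weights=weights)[0]

def walk(start, n):
    """Generate a chain of n states beginning at start."""
    chain = [start]
    for _ in range(n - 1):
        chain.append(step(chain[-1]))
    return chain
```

Each call to `step` looks only at the current state, which is exactly the Markov property.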

Markov chains are named after Andrey Markov (1856-1922).

An Example

From the diagram and values below, a transition out of state S1 has four possibilities, each with an associated probability: