Markov chain/process

The simplest form of Markov model: a series of connected observable states (i.e., a chain) with specific probabilities of transition between the states. The defining property is that the probability of the next state depends only on the current state, not on the earlier history (the Markov property).

Example of a Markov chain:
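A minimal sketch of such a chain, using a hypothetical two-state weather model (the states, probabilities, and function names below are illustrative assumptions, not from the original): each state stores its transition probabilities, and the simulation picks the next state from the current state alone.

```python
import random

# Hypothetical example: two observable states ("sunny", "rainy") with
# fixed transition probabilities. Each row sums to 1.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(state, rng):
    """Sample the next state using only the current state's probabilities."""
    states = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in states]
    return rng.choices(states, weights=weights)[0]

def simulate(start, steps, seed=0):
    """Walk the chain for `steps` transitions; history beyond the
    current state never influences the next draw (Markov property)."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain

print(simulate("sunny", steps=5))
```

Note that `next_state` consults only its `state` argument; this is exactly the memorylessness that distinguishes a Markov chain from models that condition on longer histories.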