Markov chains
• A Markov chain consists of n states, plus an n×n transition probability matrix P.
  - At each step, we are in exactly one of the states.
  - For 1 ≤ i, k ≤ n, the matrix entry P_ik tells us the probability of k being the next state, given we are currently in state i.
[Diagram: two states i and k joined by a directed edge labeled P_ik; the note points out that P_ik > 0 is OK.]
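As a concrete illustration (not part of the original slide), here is a minimal Python sketch of a 3-state chain: the entries of the made-up matrix P below are arbitrary example values, each row is a distribution over next states, and a short simulation draws the next state from the current state's row.

```python
import numpy as np

# Example 3-state transition matrix (values are illustrative only).
# Entry P[i, k] is the probability that the next state is k,
# given that the current state is i.
P = np.array([
    [0.5, 0.5, 0.0],   # P[0, 0] > 0: remaining in the same state is OK
    [0.1, 0.6, 0.3],
    [0.0, 0.4, 0.6],
])

# Each row is a probability distribution over next states,
# so every row must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# Simulate the chain: at each step we are in exactly one state,
# and we draw the next state from the current state's row of P.
rng = np.random.default_rng(0)
state = 0
for _ in range(10):
    state = rng.choice(len(P), p=P[state])
    print(state)
```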