Markov chains
A Markov chain consists of n states,
plus an n x n transition probability matrix P.
- At each step, we are in exactly one of
the states.
- For 1 <= i, k <= n, the matrix entry P_ik tells
us the probability of k being the next
state, given we are currently in state i.
- P_ii > 0 is OK (a state may transition to itself).
[Diagram: states i and k, with an arrow from i to k labeled P_ik.]
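The definition above can be sketched in code. This is a minimal illustration (the matrix values are made up for the example, not from the source): a small transition matrix P whose rows sum to 1, and one step of the chain that samples the next state k with probability P[i, k] given the current state i.

```python
import numpy as np

# Illustrative 3-state transition matrix: row i holds the
# probabilities of moving from state i to each possible next state,
# so every row must sum to 1.
P = np.array([
    [0.5, 0.5, 0.0],   # P[0, 0] > 0: a self-transition is OK
    [0.1, 0.0, 0.9],
    [0.0, 0.4, 0.6],
])

# Each row is a probability distribution over next states.
assert np.allclose(P.sum(axis=1), 1.0)

# Take one step from state i: choose next state k with probability P[i, k].
rng = np.random.default_rng(0)
i = 0
k = rng.choice(len(P), p=P[i])
```

Repeating the last two lines (feeding k back in as the new i) simulates a walk through the chain.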