Markov chains
A Markov chain consists of n states, plus an n × n transition probability matrix P.

- At each step, we are in exactly one of the states.
- For 1 ≤ i, k ≤ n, the matrix entry P_ik tells us the probability of k being the next state, given we are currently in state i.
- P_ii > 0 is OK (a state may transition back to itself).
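Since row i of P lists the probabilities of every possible next state from state i, each row must sum to one (P is row-stochastic). This condition, implied by the definition above, can be written as:

```latex
\forall i \in \{1, \dots, n\}: \quad \sum_{k=1}^{n} P_{ik} = 1 ,
\qquad P_{ik} \ge 0 .
```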
[Diagram: arrow from state i to state k, labeled with the transition probability P_ik.]
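The definitions above can be sketched in code: a minimal simulation, assuming a small hypothetical 3-state matrix P (the matrix values and function names below are illustrative, not from the slides). At each step the next state k is sampled with probability P[i][k].

```python
import random

# A hypothetical 3-state transition probability matrix P.
# Each row sums to 1; P[1][1] > 0 shows a self-loop is OK.
P = [
    [0.5, 0.5, 0.0],
    [0.1, 0.6, 0.3],
    [0.0, 0.4, 0.6],
]

def next_state(P, i, rng=random):
    """Sample the next state k with probability P[i][k]."""
    r = rng.random()
    cumulative = 0.0
    for k, p in enumerate(P[i]):
        cumulative += p
        if r < cumulative:
            return k
    return len(P[i]) - 1  # guard against floating-point round-off

def run_chain(P, start, steps, seed=0):
    """Return the sequence of states visited over `steps` transitions."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        path.append(next_state(P, path[-1], rng))
    return path
```

Because we are in exactly one state per step, the simulation is just repeated sampling from the current state's row; every observed transition i -> k must correspond to an entry P_ik > 0.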