Ergodic Markov chains
A Markov chain is ergodic if
- there is a path from any state to any other state (the chain is irreducible), and
- you can be in any state at every time step with non-zero probability (the chain is aperiodic).
With teleportation, our Markov chain is ergodic.
[Figure: example of a chain that is not ergodic]
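As a small sketch of why teleportation helps, the snippet below builds a hypothetical 3-page graph whose random walk gets trapped in one page (so the plain chain is not ergodic), mixes in a teleportation probability, and runs power iteration. The graph, the teleportation value `alpha = 0.15`, and the variable names are illustrative assumptions, not part of the original slides.

```python
import numpy as np

# Hypothetical 3-page web graph: page 2 links only to itself,
# so without teleportation the walk gets trapped there
# (the chain is not ergodic).
links = np.array([
    [0.0, 1.0, 0.0],   # page 0 -> page 1
    [0.5, 0.0, 0.5],   # page 1 -> pages 0 and 2
    [0.0, 0.0, 1.0],   # page 2 -> page 2 (a trap)
])

alpha = 0.15           # teleportation probability (assumed value)
N = links.shape[0]

# Mixing in teleportation makes every transition probability
# positive, so the chain becomes ergodic: there is a path
# between any two states, at every time step.
P = (1 - alpha) * links + alpha / N

# An ergodic chain has a unique stationary distribution pi with
# pi = pi P; power iteration converges to it from any start.
pi = np.ones(N) / N
for _ in range(100):
    pi = pi @ P

print(pi)
```

Without the teleportation term, the same iteration would pile all probability onto the trap page; with it, every page keeps a non-zero long-term visit rate.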