24 Sep 2003
CS Module 7
Ergodic Markov chains
- A Markov chain is ergodic if:
  - there is a path from any state to any other state
  - you can be in any state at every time step, with non-zero probability
- With teleportation, our Markov chain is ergodic
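The teleportation point can be sketched numerically. Below is a minimal illustration (not from the slides; the matrix and function names are made up): a small chain with a sink state is not ergodic, but mixing every row with the uniform distribution (teleportation) makes every transition probability strictly positive, so the chain becomes ergodic and power iteration converges to a unique stationary distribution.

```python
def add_teleportation(P, alpha=0.15):
    """Mix each row of transition matrix P with the uniform distribution.

    With probability alpha we 'teleport' to a uniformly random state;
    otherwise we follow the original chain. (alpha=0.15 is an assumed
    illustrative value.)
    """
    n = len(P)
    return [[(1 - alpha) * P[i][j] + alpha / n for j in range(n)]
            for i in range(n)]

def stationary(P, steps=1000):
    """Power iteration: repeatedly push a distribution through P."""
    n = len(P)
    x = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(steps):
        x = [sum(x[i] * P[i][j] for i in range(n)) for j in range(n)]
    return x

# State 2 links only to itself, so states 0 and 1 are unreachable
# from it: this chain is NOT ergodic.
P = [[0.0, 1.0, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 0.0, 1.0]]

G = add_teleportation(P)
# After teleportation every entry is strictly positive, so there is a
# one-step path between any pair of states: the chain is ergodic.
assert all(p > 0 for row in G for p in row)

pi = stationary(G)  # the unique stationary distribution of G
```

Without teleportation, power iteration on `P` would pile all probability onto the sink state 2; with it, every state retains non-zero stationary probability.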