###### Exercise 1

Consider the following \(2\times2\) stochastic matrices.

For each matrix, make a copy of the transition diagram and label each edge with the probability of that transition. Then find all the steady-state vectors and describe the long-run behavior of a Markov chain defined by that matrix.

1. \(\left[\begin{array}{rr} 1 & 1 \\ 0 & 0 \\ \end{array}\right]\text{.}\)

2. \(\left[\begin{array}{rr} 0.8 & 1 \\ 0.2 & 0 \\ \end{array}\right]\text{.}\)

3. \(\left[\begin{array}{rr} 1 & 0 \\ 0 & 1 \\ \end{array}\right]\text{.}\)

4. \(\left[\begin{array}{rr} 0.7 & 0.6 \\ 0.3 & 0.4 \\ \end{array}\right]\text{.}\)
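As a way to check your answers, here is a minimal sketch (the function names are illustrative, not from the text) that finds the steady-state vector of a \(2\times2\) column-stochastic matrix \(\left[\begin{smallmatrix} a & b \\ 1-a & 1-b \end{smallmatrix}\right]\) by solving \(P\mathbf{q}=\mathbf{q}\) together with \(q_1+q_2=1\), and that iterates the chain to watch the convergence:

```python
def steady_state(a, b):
    # For the 2x2 column-stochastic matrix
    #   P = [[a,   b  ],
    #        [1-a, 1-b]]
    # a steady-state vector q satisfies P q = q with q1 + q2 = 1.
    # The first row gives a*q1 + b*q2 = q1, i.e. (1 - a)*q1 = b*q2,
    # so q1 = b / (1 - a + b) whenever the denominator is nonzero.
    denom = 1 - a + b
    if denom == 0:
        # Happens only when a = 1 and b = 0 (the identity matrix):
        # then every probability vector is a steady-state vector.
        raise ValueError("every probability vector is steady")
    q1 = b / denom
    return (q1, 1 - q1)

def iterate(a, b, x, steps=50):
    # Repeatedly apply the Markov chain to the probability vector x.
    x1, x2 = x
    for _ in range(steps):
        x1, x2 = a * x1 + b * x2, (1 - a) * x1 + (1 - b) * x2
    return (x1, x2)
```

For the last matrix above, `steady_state(0.7, 0.6)` returns \((2/3, 1/3)\), and `iterate(0.7, 0.6, (1.0, 0.0))` approaches the same vector, illustrating convergence from an arbitrary initial state; the identity matrix is the degenerate case where every probability vector is already steady.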