##### Exercise 1

Consider the following $$2\times2$$ stochastic matrices.

For each, make a copy of the diagram and label each edge to indicate the probability of that transition. Then find all the steady-state vectors and describe what happens to a Markov chain defined by that matrix.

1. $$\left[\begin{array}{rr} 1 & 1 \\ 0 & 0 \\ \end{array}\right]\text{.}$$

2. $$\left[\begin{array}{rr} 0.8 & 1 \\ 0.2 & 0 \\ \end{array}\right]\text{.}$$

3. $$\left[\begin{array}{rr} 1 & 0 \\ 0 & 1 \\ \end{array}\right]\text{.}$$

4. $$\left[\begin{array}{rr} 0.7 & 0.6 \\ 0.3 & 0.4 \\ \end{array}\right]\text{.}$$
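To check your hand computations, you can find steady-state vectors numerically: they are eigenvectors of the stochastic matrix with eigenvalue $$1\text{,}$$ rescaled so their entries sum to $$1\text{.}$$ The sketch below assumes NumPy; the helper name `steady_states` is illustrative, not from the text.

```python
import numpy as np

def steady_states(A, tol=1e-10):
    # Collect eigenvectors of A with eigenvalue 1 and normalize
    # each so its entries sum to 1 (a probability vector).
    vals, vecs = np.linalg.eig(A)
    states = []
    for i, lam in enumerate(vals):
        if abs(lam - 1) < tol:
            v = np.real(vecs[:, i])
            states.append(v / v.sum())
    return states

# Matrix from part 4: the unique steady state is [2/3, 1/3].
A = np.array([[0.7, 0.6],
              [0.3, 0.4]])
print(steady_states(A))
```

Trying this on the identity matrix of part 3 returns two steady-state vectors, reflecting the fact that every vector is left unchanged by that Markov chain.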
