Markov Chains: Discrete Time
A discrete-time Markov chain (DTMC) is a sequence $\{X_n\}_{n\ge 0}$ taking values in a countable state space $S$ with the Markov property: the future depends on the past only through the present. This memoryless structure makes Markov chains analytically tractable and widely applicable — in queuing theory, genetics, finance, and algorithm analysis.
The chain is governed by a transition matrix $P$ with $P_{ij}=P(X_{n+1}=j\mid X_n=i)\ge 0$ and $\sum_j P_{ij}=1$. The $n$-step transition matrix satisfies $P^{(n)}=P^n$ by the Chapman-Kolmogorov equations.
Markov Property & Transition Matrix
$$P(X_{n+1}=j\mid X_0,\ldots,X_n)=P(X_{n+1}=j\mid X_n)=P_{X_n,j}.$$
The row-stochastic matrix $P$ satisfies $P_{ij}\ge 0$, $\sum_j P_{ij}=1$. The distribution at step $n$ is $\mu^{(n)}=\mu^{(0)}P^n$.
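The evolution $\mu^{(n)}=\mu^{(0)}P^n$ and the Chapman-Kolmogorov identity $P^{m+n}=P^mP^n$ are easy to check numerically. A minimal sketch with a hypothetical two-state matrix (the entries are illustrative, not from the text):

```python
import numpy as np

# Hypothetical row-stochastic transition matrix (each row sums to 1).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
mu0 = np.array([1.0, 0.0])  # start deterministically in state 0

# Distribution after n steps: mu_n = mu_0 @ P^n
n = 5
mu_n = mu0 @ np.linalg.matrix_power(P, n)

# Chapman-Kolmogorov: P^(3+4) = P^3 @ P^4
lhs = np.linalg.matrix_power(P, 7)
rhs = np.linalg.matrix_power(P, 3) @ np.linalg.matrix_power(P, 4)
assert np.allclose(lhs, rhs)

print(mu_n)  # a probability vector: entries sum to 1
```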
Perron-Frobenius & Stationary Distribution
For an irreducible, aperiodic chain on a finite state space, $P^n\to\Pi$ as $n\to\infty$, where each row of $\Pi$ equals the unique stationary distribution $\pi$ satisfying $$\pi P = \pi, \quad \sum_i \pi_i = 1.$$ For a reversible chain, detailed balance $\pi_i P_{ij}=\pi_j P_{ji}$ implies stationarity.
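The convergence $P^n\to\Pi$ can be observed directly: compute $\pi$ as the left eigenvector of $P$ for eigenvalue $1$ and compare it against the rows of a high matrix power. A sketch with a hypothetical irreducible, aperiodic 3-state chain:

```python
import numpy as np

# Hypothetical irreducible, aperiodic chain on 3 states.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.3, 0.5]])

# Stationary distribution: left eigenvector of P for eigenvalue 1,
# i.e. right eigenvector of P.T, normalized to sum to 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

# Perron-Frobenius: every row of P^n approaches pi.
Pn = np.linalg.matrix_power(P, 100)
assert np.allclose(Pn, np.tile(pi, (3, 1)))

print(pi)
```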
Example 1 — Two-State Chain
States $\{0,1\}$ with $P=\begin{pmatrix}1-\alpha&\alpha\\\beta&1-\beta\end{pmatrix}$. Solve $\pi P=\pi$: $$\pi_0=\frac{\beta}{\alpha+\beta},\quad \pi_1=\frac{\alpha}{\alpha+\beta}.$$ With $\alpha=0.3$, $\beta=0.5$: $\pi=(0.625,\,0.375)$.
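The closed-form solution above can be verified by checking $\pi P=\pi$ for the stated parameters:

```python
import numpy as np

alpha, beta = 0.3, 0.5
P = np.array([[1 - alpha, alpha],
              [beta, 1 - beta]])

# Closed form from the text: pi = (beta, alpha) / (alpha + beta)
pi = np.array([beta, alpha]) / (alpha + beta)

assert np.allclose(pi @ P, pi)  # stationarity
print(pi)                       # stationary distribution (0.625, 0.375)
```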
Example 2 — Random Walk on $\mathbb{Z}$
$P_{i,i+1}=p$, $P_{i,i-1}=q=1-p$. The walk is recurrent iff $p=1/2$. The generating function of the first-return time satisfies a quadratic equation; for hitting probabilities, starting from state $k>0$ the walk reaches $0$ with probability $1$ when $p\le 1/2$ and with probability $(q/p)^k$ when $p>1/2$.
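The hitting probability $(q/p)^k$ can be checked numerically. A sketch under an assumption: the infinite walk is approximated by a walk on $\{0,\ldots,N\}$ with absorption at both ends, and $N$ is taken large enough that the truncation error is negligible when $p>1/2$.

```python
import numpy as np

# Biased walk: P(step +1) = p, P(step -1) = q. Probability h_k of ever
# hitting 0 from state k, approximated on {0,...,N} for large N.
p, q, N, k = 0.6, 0.4, 200, 3

# Boundary conditions h_0 = 1, h_N = 0; interior recurrence
# h_i = q * h_{i-1} + p * h_{i+1}. Solve the linear system A h = b.
A = np.zeros((N + 1, N + 1))
b = np.zeros(N + 1)
A[0, 0] = 1.0; b[0] = 1.0
A[N, N] = 1.0; b[N] = 0.0
for i in range(1, N):
    A[i, i] = 1.0
    A[i, i - 1] = -q
    A[i, i + 1] = -p
h = np.linalg.solve(A, b)

# Compare with the closed form (q/p)^k for the walk on all of Z.
print(h[k], (q / p) ** k)
```

With $p=0.6$ the truncation at $N=200$ is harmless, since $(q/p)^N$ is astronomically small; the two printed values agree to many digits.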
Practice
- Find the stationary distribution of $P=\begin{pmatrix}0.6&0.4\\0.2&0.8\end{pmatrix}$.
- Compute $P^2$ for the matrix above and verify Chapman-Kolmogorov.
- Define periodicity of a state and give an example of a periodic chain.
Answer Key
1. $\pi P=\pi$ with $\pi_1+\pi_2=1$ gives $0.6\pi_1+0.2\pi_2=\pi_1$, so $\pi_2=2\pi_1$, hence $\boldsymbol{\pi}=(\tfrac{1}{3},\,\tfrac{2}{3})$.
2. $P^2=\begin{pmatrix}0.44&0.56\\0.28&0.72\end{pmatrix}$. Chapman-Kolmogorov: $(P^2)_{ij}=\sum_k P_{ik}P_{kj}$, verified by direct multiplication.
3. State $i$ has period $d$ if $d=\gcd\{n\ge1:P_{ii}^{(n)}>0\}$. Example: a random walk on a bipartite graph (e.g., $\{0,1\}$ with transitions $0\to1$ and $1\to0$ only) has period 2.
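The period definition in answer 3 translates directly into code: collect the return times $n$ with $P_{ii}^{(n)}>0$ (up to a cutoff, an assumption sufficient for this small chain) and take their gcd.

```python
from functools import reduce
from math import gcd

import numpy as np

# Two-state flip chain: 0 -> 1 and 1 -> 0 deterministically.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Return times of state 0: all n <= 20 with (P^n)_{00} > 0.
returns = [n for n in range(1, 21)
           if np.linalg.matrix_power(P, n)[0, 0] > 0]

period = reduce(gcd, returns)
print(period)  # 2
```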