
Microstates, Macrostates & Entropy

35 min · Statistical Mechanics


Statistical mechanics bridges atomic-level physics with thermodynamics by counting accessible states. A microstate specifies all particle positions and momenta; a macrostate is defined by just a few observable quantities.

Definition

Boltzmann entropy: \(S = k_B \ln\Omega\) where \(\Omega\) is the number of equally accessible microstates consistent with the macrostate. \(k_B = 1.38\times10^{-23}\) J/K.
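As a quick numerical check, the formula can be evaluated directly (a minimal Python sketch; the function name is ours):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega):
    """S = k_B ln(Omega) for a macrostate with Omega equally likely microstates."""
    return K_B * math.log(omega)

print(boltzmann_entropy(1))  # 0.0: a unique microstate carries zero entropy
print(boltzmann_entropy(2))  # k_B ln 2, the entropy of one unknown two-state choice
```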

Key Result

The second law of thermodynamics follows statistically: systems overwhelmingly tend toward macrostates with more microstates (higher \(\Omega\)). Entropy is thus a measure of disorder or uncertainty.

Example 1

A system of \(N\) two-state spins has \(\Omega(E) = \binom{N}{N_{\uparrow}}\) microstates with \(N_{\uparrow}\) spins up. For \(N=100\), \(\Omega_{\max}=\binom{100}{50}\approx10^{29}\), so the half-up macrostate is about \(10^{29}\) times more probable than the all-up macrostate, which has only one microstate.
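Python's exact integer arithmetic makes this count easy to verify (a small sketch; the variable names are ours):

```python
from math import comb

N = 100
omega_max = comb(N, N // 2)  # microstates of the 50-up / 50-down macrostate
omega_all_up = comb(N, N)    # the all-up macrostate has exactly one microstate

print(f"{omega_max:.3e}")          # 1.009e+29
print(omega_max // omega_all_up)   # ratio of macrostate probabilities
```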

Example 2

Stirling's approximation \(\ln N! \approx N\ln N - N\) converts the combinatorial entropy \(S = k_B \ln\left(N!/\prod_i N_i!\right)\) into the thermodynamic form \(S = -Nk_B\sum_i p_i\ln p_i\), where \(p_i = N_i/N\) is the fraction of particles in state \(i\).
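The quality of the approximation can be checked directly, using `math.lgamma` to get \(\ln N!\) without computing the enormous factorial itself (a sketch):

```python
import math

N = 1_000_000
exact = math.lgamma(N + 1)            # ln(N!) via the log-gamma function
stirling = N * math.log(N) - N
rel_err = abs(exact - stirling) / exact
print(rel_err)  # ~6e-07: already tiny at N = 10^6, utterly negligible at N ~ 10^23
```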


Practice

  1. Define entropy using Shannon's information theory and compare it to Boltzmann entropy.
  2. Why is the entropy of mixing always non-negative?
  3. Estimate the number of microstates for 1 mol of ideal gas at room temperature.
  4. What is the Gibbs paradox, and how does quantum mechanics resolve it?
Answer Key

1. Boltzmann: $S = k_B \ln \Omega$ counts microstates of an isolated system. Shannon: $S = -k_B \sum p_i \ln p_i$ measures missing information given a probability distribution $\{p_i\}$. For the microcanonical ensemble (all $\Omega$ states equally likely, $p_i = 1/\Omega$), Shannon reduces to Boltzmann. Shannon is more general — applies to any ensemble.
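The reduction of Shannon to Boltzmann entropy for a uniform distribution can be verified numerically (a sketch; the function name is ours):

```python
import math

K_B = 1.380649e-23  # J/K

def shannon_entropy(p):
    """S = -k_B * sum(p_i ln p_i); zero-probability terms contribute nothing."""
    return -K_B * sum(pi * math.log(pi) for pi in p if pi > 0)

omega = 1000
uniform = [1 / omega] * omega  # microcanonical ensemble: all states equally likely
assert math.isclose(shannon_entropy(uniform), K_B * math.log(omega))

# A biased distribution carries less missing information than a uniform one:
print(shannon_entropy([0.9, 0.1]) < K_B * math.log(2))  # True
```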

2. For two distinguishable ideal gases initially in separate volumes $V_A, V_B$: after mixing, each gas occupies $V_A+V_B$. $\Delta S = N_A k_B \ln\frac{V_A+V_B}{V_A} + N_B k_B \ln\frac{V_A+V_B}{V_B} \geq 0$ (by concavity of $\ln$). Non-negative because the mixed state has more microstates (more accessible volume per particle).
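Plugging numbers into the mixing formula (a sketch; names are ours):

```python
import math

K_B = 1.380649e-23  # J/K

def mixing_entropy(n_a, n_b, v_a, v_b):
    """Delta S for two distinguishable ideal gases expanding into V_A + V_B."""
    v = v_a + v_b
    return n_a * K_B * math.log(v / v_a) + n_b * K_B * math.log(v / v_b)

# Equal amounts (1 mol each) in equal volumes: Delta S = 2 N k_B ln 2
N = 6.022e23
print(mixing_entropy(N, N, 1.0, 1.0))  # ~11.5 J/K, i.e. 2 R ln 2
```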

3. For $N \sim 10^{23}$ particles at $T \sim 300$ K in volume $V$: $\Omega \sim (V/\lambda^3)^N / N!$ where $\lambda = h/\sqrt{2\pi m k_B T}$ is the thermal de Broglie wavelength. Very roughly $\Omega \sim 10^{10^{25}}$, an astronomically large number. Since $\ln\Omega \sim 10N$, the entropy is $S = k_B \ln\Omega \sim 10\,Nk_B \sim 10^2$ J/K; the full Sackur–Tetrode formula gives about $155$ J/K for 1 mol of argon at standard conditions.
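The estimate can be carried out numerically (a sketch assuming argon, with $m \approx 6.6\times10^{-26}$ kg, at 1 atm; all names are ours):

```python
import math

K_B = 1.380649e-23   # J/K
H = 6.62607015e-34   # Planck constant, J s
N_A = 6.02214076e23  # Avogadro's number

# Thermal de Broglie wavelength, here for argon (m ~ 6.6e-26 kg) at 300 K:
m, T = 6.6e-26, 300.0
lam = H / math.sqrt(2 * math.pi * m * K_B * T)  # ~1.6e-11 m

# 1 mol of ideal gas at ~1 atm and 300 K occupies V ~ 0.0245 m^3:
V, N = 0.0245, N_A

# Stirling: ln[(V / lam^3)^N / N!] ~ N * (ln(V / (N * lam^3)) + 1)
ln_omega = N * (math.log(V / (N * lam**3)) + 1)
print(f"log10(Omega) ~ {ln_omega / math.log(10):.1e}")  # ~4.5e+24
```

So $\Omega \sim 10^{10^{24\text{--}25}}$, consistent with the rough figure above.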

4. Gibbs paradox: classical entropy of mixing predicts $\Delta S > 0$ even when identical gases are mixed (merely removing a partition). Resolution: quantum mechanics requires identical particles to be indistinguishable, so we must divide phase space by $N!$ (correct Boltzmann counting). This removes the paradox: no entropy change when identical gases mix.
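The resolution can be made concrete with the configurational part of $\ln\Omega$ (a sketch; the momentum factors cancel between the two sides since $T$ is unchanged, and all names are ours):

```python
import math

def ln_omega_config(n, v, distinguishable):
    """Configurational ln(Omega): ln(V^N) classically, ln(V^N / N!) with
    correct Boltzmann counting (Stirling: ln N! ~ N ln N - N)."""
    s = n * math.log(v)
    if not distinguishable:
        s -= n * math.log(n) - n
    return s

# Two equal samples of the SAME gas; a partition between them is removed.
N, V = 1e6, 1.0
for dist in (True, False):
    before = 2 * ln_omega_config(N, V, dist)
    after = ln_omega_config(2 * N, 2 * V, dist)
    print(dist, after - before)  # True: 2N ln 2 (the paradox); False: ~0
```

With distinguishable counting, removing the partition spuriously produces $\Delta S/k_B = 2N\ln 2$; dividing by $N!$ makes the change vanish.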