Table of Contents
- Classical Thermo
- Microcanonical Ensemble
- Canonical Ensemble
- Grand Canonical Ensemble
- Quantum Information Perspective
- Classical Gases
- Quantum Gases
- Debye Model of Solids
- Electronic Gas in a Magnetic Field
- Phase Transitions and Van der Waals gas
- Ising Model
- Polymers
- Brownian motion
Classical Thermo
laws:
- equilibrium is transitive (gives us the idea of temperature)
- amount of work required to change an isolated system's state is independent of how the work is performed. for non-isolated systems, the change of energy includes a heat term $\Delta E = Q + W$
- entropy increases
- (Kelvin:) no process exists whose sole effect is to extract heat from a reservoir and turn it into work
- (Clausius:) no process exists whose sole effect is to transfer heat from cold to hot
- not really as important as the others, but it's \begin{equation} \lim_{T\to 0}S(T) = 0 \end{equation}
- the ground state entropy shouldn't grow extensively
- heat capacities tend to zero
Carnot
isothermal expansion at hot temperature $\to$ adiabatic expansion (cools) $\to$ isothermal contraction at cold temperature $\to$ adiabatic contraction (heats up)
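for reference, the efficiency of this cycle (the best any engine running between these two reservoirs can achieve):
\begin{equation}
\eta = \frac{W}{Q_H} = 1 - \frac{Q_C}{Q_H} = 1 - \frac{T_C}{T_H}
\end{equation}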
entropy
is a function of state – doing the integral in a (REVERSIBLE) loop gives you zero; otherwise we have the Clausius inequality
(tells us that entropy of an isolated system never decreases)
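explicitly, the Clausius inequality and the definition of entropy it motivates:
\begin{equation}
\oint \frac{\dd{Q}}{T} \leq 0 \qquad \dd{S} = \frac{\dd{Q}_\text{rev}}{T}
\end{equation}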
Thermodynamic potentials
at fixed energy, the entropy doesn't decrease. other extremization principles follow:
free energy is a measure of the amount of energy free to do work at finite temperature – at constant temperature and volume, the free energy can never increase
at fixed temperature and pressure, minimize $G$.
Note: by extensivity, $G(p,\,T,\,N) = \mu(p,\,T)N$; cf. $\Phi = -p(T,\,\mu)V$
at fixed energy and pressure, consider enthalpy
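for reference, the potentials referred to above and their differentials (at fixed $N$, except for $\Phi$):
\begin{align}
F &= E - TS & \dd{F} &= -S\dd{T} - p\dd{V} \\
G &= F + pV & \dd{G} &= -S\dd{T} + V\dd{p} \\
H &= E + pV & \dd{H} &= T\dd{S} + V\dd{p} \\
\Phi &= F - \mu N & \dd{\Phi} &= -S\dd{T} - p\dd{V} - N\dd{\mu}
\end{align}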
Maxwell relations
rewrite derivatives that you don't know in terms of things you do!
when looking for something of the form
the idea is to find $A$ as a first derivative of some function of state that has $\dd{B}$ and $\dd{C}$ as differentials; this lets us swap $A$ for the $B$ derivative. more explicitly,
find a thermodynamic potential of the form $\dd{X} = A\dd{\alpha} + \beta\dd{B} + \gamma\dd{C}$. Then
as an example, consider
our function of state is
Heat capacities the above machinery does nice things for these too; recalling
we find
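a concrete instance of the recipe: to trade away $(\partial S/\partial V)_T$, note that $\dd{F} = -S\dd{T} - p\dd{V}$ has the right differentials, so equality of mixed partials of $F$ gives
\begin{equation}
\left(\frac{\partial S}{\partial V}\right)_T = \left(\frac{\partial p}{\partial T}\right)_V
\end{equation}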
Microcanonical Ensemble
Fixed energy $E$ gives us a notion of $S$, $T$
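explicitly,
\begin{equation}
S(E) = k_B\log\Omega(E) \qquad \frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_V
\end{equation}
where $\Omega(E)$ counts the microstates at energy $E$.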
Canonical Ensemble
Fixed $T$ gives us an $\ev{E}$ (“softly” fixed energy by tuning $\beta$)
Boltzmann distrib:
$Z$ multiplicative for independent systems
where the last equality holds for the Boltz dist
reduces to microcanon def if $E = E_\star$ (most likely energy) $= \ev{E}$
Free energy
with particle number,
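collecting the standard CE results, including the chemical potential from $F$:
\begin{gather}
p_n = \frac{e^{-\beta E_n}}{Z} \qquad \ev{E} = -\partial_\beta\log Z \qquad S = -k_B\sum_n p_n\log p_n \\
F = -k_BT\log Z \qquad \mu = \left(\frac{\partial F}{\partial N}\right)_{T,V}
\end{gather}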
Grand Canonical Ensemble
no longer fix particle number
Entropy has the same form as in the CE, $k_B\partial_T(T\log \mathcal{Z})$. $E$ picks up an extra term:
grand potential
we have intensive-extensive pairings: $TS$, $pV$, $\mu N$; this is what makes $E$ extensive
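for reference, the GCE dictionary (fugacity $z = e^{\beta\mu}$):
\begin{gather}
\mathcal{Z} = \sum_N z^N Z_N \qquad \Phi = -k_BT\log\mathcal{Z} = -pV \\
\ev{N} = \frac{1}{\beta}\partial_\mu\log\mathcal{Z} \qquad \ev{E} = -\partial_\beta\log\mathcal{Z}\big|_\mu + \mu\ev{N}
\end{gather}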
Quantum Information Perspective
have a density matrix instead of probability distribution:
Grand canon is nice in second quant, where we have ladder operators for $\hat{N}$
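explicitly, the thermal density matrix and the von Neumann entropy:
\begin{equation}
\hat{\rho} = \frac{e^{-\beta\hat{H}}}{Z} \qquad Z = \Tr e^{-\beta\hat{H}} \qquad S = -k_B\Tr(\hat{\rho}\log\hat{\rho})
\end{equation}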
Classical Gases
Monatomic gas
in the monatomic case,
and we get the $N$-particle gas by $Z_N = Z_1^N = V^N\lambda^{-3N}$ (treating particles as distinguishable – see below)
Ideal gas law EoS from $p=-\partial_VF$
equipartition: for each kinetic DoF we have $E\mathbin{+\kern-0.5ex=} \frac{1}{2}k_BT$, (3D = 3$N$ DoF)
note: need to account for indistinguishability in the ideal gas partition function: $Z_N = Z_1^N/N!$
(sackur-tetrode equation)
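for reference, with $\lambda = \sqrt{2\pi\hbar^2/mk_BT}$ the thermal de Broglie wavelength:
\begin{equation}
S = Nk_B\left[\log\left(\frac{V}{N\lambda^3}\right) + \frac{5}{2}\right]
\end{equation}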
adding in a chemical potential, (remember to sum over all $N$ – gives an exp)
maxwell-boltz distrib (from viewing $Z_1$ as sum over states of probability):
gives velocity distribution of a classical gas
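explicitly, the speed distribution (normalized here to the number density $n$):
\begin{equation}
f(v) = 4\pi n\left(\frac{m}{2\pi k_BT}\right)^{3/2} v^2\, e^{-mv^2/2k_BT}
\end{equation}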
Diatomic gas
get these new $Z$s from a phase space integral for the various parts of the hammy
oscillation “freezes out” first, then rotation – limitations of classical equipartition theory (also think about how a deep potential well gives same mechanics as rigid connection, but different degrees of freedom counting. we need the full quantum explanation)
Interacting Gas
virial expansion
define the mayer f function
allows us to rewrite partition
and we find the pressure is
at which point we must pick a $U$ and perform the $f$ integral. typical choice:
which gives
at low density and high temperatures for parameters
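assuming the typical choice is a hard core of radius $r_0$ with an attractive $-U_0(r_0/r)^6$ tail, the $f$ integral gives the second virial coefficient
\begin{equation}
B_2(T) = -\frac{1}{2}\int\dd[3]{r}\,f(r) = \frac{2\pi r_0^3}{3}\left(1 - \frac{U_0}{k_BT}\right) \equiv b - \frac{a}{k_BT}
\end{equation}
so $p \approx nk_BT\left(1 + B_2(T)\,n\right)$, which rearranges into van der Waals form.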
higher corrections by cluster expansion
Quantum Gases
DENSITY OF STATES: “if instead of integrating over states, i want to integrate over energies, what do i need as a prefactor?”
for the usual dispersion relation
or relativistic
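for reference (3D, no spin degeneracy):
\begin{equation}
E = \frac{\hbar^2k^2}{2m} \implies g(E) = \frac{V}{4\pi^2}\left(\frac{2m}{\hbar^2}\right)^{3/2}\sqrt{E}
\qquad
E^2 = p^2c^2 + m^2c^4 \implies g(E) = \frac{VE\sqrt{E^2 - m^2c^4}}{2\pi^2\hbar^3c^3}
\end{equation}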
photon Gas
photons: idea is to have $Z_\omega$ for each frequency, sum over occupation:
giving
whence we find the Planck distribution of energy,
and wien’s law, $\omega_\text{max} \sim 1/\beta\hbar$. we also get stefan-boltz,
free energy gives us pressure, entropy, heat capacity
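collecting the standard results (with $g(\omega) = V\omega^2/\pi^2c^3$ including both polarizations):
\begin{equation}
E(\omega)\dd{\omega} = \frac{V\hbar}{\pi^2c^3}\frac{\omega^3\dd{\omega}}{e^{\beta\hbar\omega}-1}
\qquad
\frac{E}{V} = \frac{\pi^2 k_B^4}{15\hbar^3c^3}\,T^4
\end{equation}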
Bose Gas
only makes sense when $\mu < 0$, or fugacity $z = e^{\beta\mu} \in (0,\,1)$
doing the usual,
where we integrate the log using an IBP: $\dd{E}g(E) \sim \dd{(E^{3/2})} \sim \dd{(Eg(E))}$
high-temp (small $z$) expansion of density:
gives equation of state
bosons reduce pressure!
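explicitly, the leading correction (with $n = N/V$):
\begin{equation}
pV = Nk_BT\left(1 - \frac{\lambda^3 n}{4\sqrt{2}} + \cdots\right)
\end{equation}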
BECs
our $\int\dd{E}\sqrt{E}$ kills $E =0$ states when we try to sum over momenta; manually add in the ground-state occupation $N_0 = \frac{z}{1-z}$
($g$ is a polylog – numerical integration factor. $g_n(1) = \zeta(n)$). Fix parameters s.t.
which lets $\rho_\text{gs}$ make up for the difference; leads to the above expression for $N$ so long as $\rho\geq \rho_c = \lambda^{-3}\zeta(3/2)$ (“critical density”). below this density, $\mu < 0$ strictly and we have the usual bose gas form. at and above, however, $\mu = 0$ and we get ground state occupancy
GS occupancy has
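the standard result:
\begin{equation}
\frac{N_0}{N} = 1 - \left(\frac{T}{T_c}\right)^{3/2} \qquad k_BT_c = \frac{2\pi\hbar^2}{m}\left(\frac{n}{\zeta(3/2)}\right)^{2/3}
\end{equation}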
for $T_c$ the temp when $z=1$. let’s see $C_V$:
after a lot of approximations. $C_V$ is continuous but its derivative is not – a continuous transition (third order in the Ehrenfest classification, not first order)
Fermi Gas
no restrictions on $\mu$ anymore. $g(E)$ carries spin degeneracy $g_s = 2s+1$
and we have the usual
with the small $z$ EoS
fermions increase the pressure (by the same factor!)
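explicitly, the fermionic counterpart of the bosonic expansion:
\begin{equation}
pV = Nk_BT\left(1 + \frac{\lambda^3 n}{4\sqrt{2}\,g_s} + \cdots\right)
\end{equation}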
in the $T\to 0$ limit, we have states filled until the fermi energy $E_F=\mu(T=0)$ – though $\mu$ isn't really a function of $T$, the condition on keeping $N$ fixed allows us to write one in terms of the other (write $N$ as an integral up to the surface)
and we can compute
which is a nonzero “degeneracy” pressure at $T=0$
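explicitly, at $T=0$ (with spin degeneracy $g_s$):
\begin{equation}
E_F = \frac{\hbar^2}{2m}\left(\frac{6\pi^2 n}{g_s}\right)^{2/3} \qquad E = \frac{3}{5}NE_F \qquad p = \frac{2E}{3V} = \frac{2}{5}nE_F
\end{equation}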
in $T\ll T_F$, we can take the integrals to infinity instead of cutting them off. Only states within $k_BT$ of the fermi surface are affected by the temperature, so we can evaluate derivatives of the distribution at $E_F$; this is the only place it changes.
(idea: we have $g(E_F)k_BT$ particles contributing to the physics, each of which has $E\sim k_BT$ – this gives linear heat capacity)
we often combine this linear electronic contribution with the cubic phononic contribution (see here) to get the full heat capacity of metals.
to do this low temp expansion rigorously, we sommerfeld expand some polylogs
the expansion tells us the low-temp expansion in $1/\log(z) = 1/\beta\mu$
whence we can find
and get the heat capacity above.
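for reference, the leading Sommerfeld results for the ideal Fermi gas ($T_F = E_F/k_B$):
\begin{equation}
\mu \approx E_F\left[1 - \frac{\pi^2}{12}\left(\frac{T}{T_F}\right)^2\right] \qquad C_V \approx \frac{\pi^2}{2}Nk_B\frac{T}{T_F}
\end{equation}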
Diatomic gas
rotation: (recall $2j+1$ degeneracy, sum over all $j$)
vibration:
where the low $T$ limit gives the zero-point energy of the QHO and doesn't contribute to $C_V$
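explicitly:
\begin{equation}
Z_\text{rot} = \sum_{j=0}^\infty (2j+1)\,e^{-\beta\hbar^2 j(j+1)/2I} \qquad Z_\text{vib} = \sum_{n=0}^\infty e^{-\beta\hbar\omega(n+1/2)} = \frac{1}{2\sinh(\beta\hbar\omega/2)}
\end{equation}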
Debye Model of Solids
basically just follows from a linear dispersion (and polarization degeneracy)
integrals taken up to a cutoff frequency $\omega_D$. To determine the cutoff, consider
which gives
and we can find energy and heat capacity the usual ways.
in low temp limit, integrate to infinity; in high temp limit expand integrand
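for reference, with $g(\omega) = 3V\omega^2/2\pi^2c_s^3$ (three polarizations, sound speed $c_s$) and $k_BT_D = \hbar\omega_D$:
\begin{equation}
3N = \int_0^{\omega_D}\dd{\omega}\,g(\omega) \implies \omega_D^3 = 6\pi^2 n c_s^3 \qquad C_V \xrightarrow{\,T\ll T_D\,} \frac{12\pi^4}{5}Nk_B\left(\frac{T}{T_D}\right)^3
\end{equation}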
Electronic Gas in a Magnetic Field
pauli paramagnetism
effect from spin coupling to $B$:
we can compute high temp ($z\sim 0$) magnetization
and susceptibility
at low temps, use expansion of $f_n(z)$ to find
idea: only the $g(E_F)$ electrons on the surface are free to flip
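explicitly, the two limits of the susceptibility:
\begin{equation}
\chi \approx \frac{N\mu_B^2}{k_BT} \quad (T\gg T_F) \qquad \chi \approx \mu_B^2\, g(E_F) \quad (T\ll T_F)
\end{equation}
(Curie law at high temperature; Pauli's temperature-independent result at low temperature)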
Landau diamagnetism
effect from lorentz force (taking $B$ in the $+z$ direction)
solving the eigenvalue problem says energy states come in landau levels
which have degeneracy
we proceed to compute the magnetism
using the partition function
this is comparable to pauli but of an opposite sign.
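for reference, the ingredients (cyclotron frequency $\omega_c$, sample area $A$ perpendicular to $B$):
\begin{equation}
E_{n,k_z} = \hbar\omega_c\left(n+\frac{1}{2}\right) + \frac{\hbar^2k_z^2}{2m} \qquad \omega_c = \frac{eB}{m} \qquad \text{degeneracy} = \frac{eBA}{2\pi\hbar}
\end{equation}
and the result $\chi_\text{Landau} = -\frac{1}{3}\chi_\text{Pauli}$.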
Phase Transitions and Van der Waals gas
isotherms have that weird wiggle in a $p-v$ diagram below the critical temperature: the transition is marked by the critical isotherm, where the wiggle flattens into an inflection point
below the critical point, part of the isotherm has $\partial_v p > 0$ (unstable compressibility), so it's unphysical: we use maxwell's “lol just draw a straight line then” prescription (which comes from setting liquid and gas in chemical equilibrium, $\mu_\ell = \mu_g$ – can also equate GFE per particle)
clausius-clapeyron equation from looking at $p-T$ graph. coexistence region from $p-v$ squeezed into a line (think about traversing an isobar in the $p-v$ diagram and what it means in $p-T$ space). equality of gibbs gives
where we’ve defined the specific latent heat
this applies to any first-order transition; here we have
as our first-derivative discontinuities
note that $S \text{ discontinuous} \implies C\sim \partial_T S$ goes to infinity – the temperature doesn't change as we pour heat into the system
We can solve the CC equation with a few assumptions (ideal gas, $v_g\gg v_\ell$, $L$ constant).
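under those assumptions $v_g \approx k_BT/p$, and the CC equation integrates to
\begin{equation}
\frac{\dd{p}}{\dd{T}} = \frac{Lp}{k_BT^2} \implies p = p_0\, e^{-L/k_BT}
\end{equation}
(with $L$ here the latent heat per particle).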
really slick way of getting the critical point: start by rearranging VdW:
the critical point is defined by $\partial_vp=\partial^2_vp = 0$, so at the critical temperature, we only have this cubic term:
and we can compare term by term in $v$ to get
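matching the cubic $(v-v_c)^3$ term by term against the rearranged VdW equation gives
\begin{equation}
v_c = 3b \qquad k_BT_c = \frac{8a}{27b} \qquad p_c = \frac{a}{27b^2}
\end{equation}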
another handy way of rewriting vdw is in terms of reduced variables; we divide by the critical value, and the equation takes the form
which is the path toward the critical exponents
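explicitly, with $\tilde{p} = p/p_c$, $\tilde{v} = v/v_c$, $\tilde{T} = T/T_c$, the law of corresponding states:
\begin{equation}
\left(\tilde{p} + \frac{3}{\tilde{v}^2}\right)\left(3\tilde{v} - 1\right) = 8\tilde{T}
\end{equation}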
Ising Model
where we’re interested in
mean field approximation: write spins in terms of deviation from average and assume that fluctuations are small
so the energy becomes
and we find, since each spin acts independently,
zero magnetic field
when $\beta Jq < 1$ the only solution for $m$ is $m=0$: there is no average magnetization at high temperatures. if the temperature is low enough, however, we have an unstable solution at $m=0$ and two stable solutions at $m=\pm m_0$, and in the limit of zero temp $m\to\pm 1$ (all spins aligned). as we vary $T$, we have a singularity in $\partial_T m$:
second order transition as we vary $T$
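a quick numerical check of the mean-field self-consistency condition $m = \tanh(\beta(Jqm + B))$ – a sketch only: the units ($k_B = 1$), parameter values, and the fixed-point iteration are my choices, not from the notes.

```python
import math

def mean_field_m(T, J=1.0, q=4, B=0.0, tol=1e-12, max_iter=100000):
    """Solve the mean-field equation m = tanh((J*q*m + B)/T)
    by fixed-point iteration, in units where k_B = 1."""
    m = 1.0  # start fully aligned to land on the m >= 0 branch
    for _ in range(max_iter):
        m_new = math.tanh((J * q * m + B) / T)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m_new

# critical temperature is T_c = J*q = 4 in these units
print(mean_field_m(T=8.0))  # above T_c: m -> 0
print(mean_field_m(T=2.0))  # below T_c: spontaneous m > 0
```

iterating from $m = 1$ picks out the stable $m \geq 0$ branch; starting from $m = -1$ would find $-m_0$ instead.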
(note: high temperature expansion gets into some stat field theory and RG stuff – possibly important to know?)
nonzero magnetic field
there is no longer a phase transition for a fixed $B$ as $T$ varies: at large temps, the magnetization goes to zero as
and at small temps all spins align with the $B$ field (no choice to make). drawing an $m-T$ graph shows how turning on $B$ separates and smooths out what was a singularity in the $B=0$ case. however, if we vary $B$ and swap its direction, the magnetization (a first derivative) jumps discontinuously:
first order transition as we vary $B$ from negative to positive and $T<T_C = Jq/k_B$
the critical exponents we get
are the same as for VdW
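for reference, the mean-field values:
\begin{equation}
m \sim (T_c - T)^{1/2} \qquad \chi \sim |T - T_c|^{-1} \qquad m \sim B^{1/3} \text{ at } T = T_c \qquad C \text{ discontinuous}
\end{equation}
i.e. critical exponents $\beta = 1/2$, $\gamma = 1$, $\delta = 3$, $\alpha = 0$.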
Polymers
Simplest model: the polymer as a random walk. You get a binomial distribution, which approaches a gaussian in the large $N$ limit. in 1D,
for $d$ dimensions each $\sigma_x^2$ gets divided by $d$ since the total $\sigma^2$ is basically the sums of the individual dimensional walks – in each dimension you only have to walk $1/d$ of the way there (this is not valid reasoning but it’s a way to remember it)
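explicitly, in 1D with $N$ steps of length $a$:
\begin{equation}
P(X) \approx \frac{1}{\sqrt{2\pi Na^2}}\, e^{-X^2/2Na^2} \qquad \ev{X^2} = Na^2
\end{equation}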
Microcanonical perspective: fix $X$ and calculate $F(X)$
so if we can find the entropy we can take a derivative to find $F(X)$. The number of configurations with length $X$ is $2^N P(X)$, so the entropy comes immediately from the gaussian above.
Canonical perspective: fixing $F$ to calculate $X$
we note that $X = \sum a\cos\theta_i$, so we can just use $E = -FX$
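taking each segment free to point anywhere on the sphere, $Z_1 = \int\dd{\Omega}\, e^{\beta Fa\cos\theta} \propto \sinh(\beta Fa)/\beta Fa$, and
\begin{equation}
\ev{X} = \frac{1}{\beta}\partial_F\log Z_1^N = Na\left(\coth(\beta Fa) - \frac{1}{\beta Fa}\right)
\end{equation}
the Langevin function; at small force it reduces to $\ev{X} \approx Na^2\beta F/3$ (Hookean spring).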
Brownian motion
idea: large particle of radius $b$ suspended in a fluid. we have a stokes law velocity-dependent damping force and some random force that is time-uncorrelated, i.e. $\ev{F(t)F(t')} = c\,\delta(t-t')$
can reduce the order and get solutions
where we can now pick out a $v(t)$ by integrating $\dot{A}$ back again
relating our random variable $F$ to a new random variable $v$. For a fixed $T$, we can determine the $c$ in $\ev{FF}$: we calculate $\ev{v^2}$ both according to this description of $v$ and from taking a boltzmann (canonical ensemble) probability distribution.
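running the comparison: with $m\dot{v} = -\gamma v + F(t)$, the steady state of the Langevin solution gives
\begin{equation}
\ev{v^2} = \frac{c}{2m\gamma} \overset{!}{=} \frac{k_BT}{m} \implies c = 2\gamma k_BT
\end{equation}
a fluctuation-dissipation relation: the noise strength is fixed by the damping and the temperature.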