Proving that the Boltzmann entropy is equal to the thermodynamic entropy

Physics Asked by Wade Hodson on August 3, 2021

I’ve been trying to understand how we can equate the Boltzmann entropy $k_B \ln \Omega$ and the entropy from thermodynamics. I’m following the approach found in the first chapter of Pathria’s Statistical Mechanics, and in many other texts. Many other questions on Stack Exchange come close to addressing this problem, but I don’t think any of the answers get at my specific question.

So, we’re considering two isolated systems 1 and 2, which are brought into thermal contact and allowed to exchange energy (let’s assume for simplicity that they can only exchange energy). On the thermodynamic side of the problem, we have the necessary and sufficient condition for thermal equilibrium

$$T_1=\frac{\partial E_1}{\partial S_1}=T_2=\frac{\partial E_2}{\partial S_2},$$

where the temperatures $T_1$ and $T_2$, the internal energies $E_1$ and $E_2$, and the entropies $S_1$ and $S_2$ are all defined appropriately in operational, thermodynamic terms. On the other hand, we can show that the necessary and sufficient condition for equilibrium from the standpoint of statistical mechanics is given by

$$\beta_1 \equiv \frac{\partial \ln \Omega_1}{\partial E_1}= \beta_2 \equiv \frac{\partial \ln \Omega_2}{\partial E_2}.$$

Here, $\Omega_1$ and $\Omega_2$ are the number of microstates associated with the macrostate of each system. Now, since both of these relations are necessary and sufficient for equilibrium, one equality holds if and only if the other also holds. My question is: How can we proceed from here to show that $S=k_B \ln \Omega$, without limiting our scope to specific examples (like an ideal gas)? In Pathria’s text and in other treatments, I don’t see much explanation for how this step is justified.

My possibly wrong thoughts are: It seems like we first need to show that $\beta$ is a function of $T$ alone (and indeed the same function of $T$ for both systems), and then show that the form of this function is in fact $\beta \propto T^{-1}$. But I’m not sure how to prove either of those claims.
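(Added illustration, not part of the original question: a minimal Python sketch of the statistical equilibrium condition, using two Einstein solids as stand-ins for systems 1 and 2. The multiplicity formula $\Omega(N,q)=\binom{q+N-1}{q}$ and the toy sizes are assumptions of the sketch.)

    # Sketch (not from the original post): two Einstein solids exchanging energy.
    # Omega(N, q) = C(q + N - 1, q) counts microstates of N oscillators holding q quanta.
    # The most probable split of the total energy is where d(ln Omega)/dE matches,
    # i.e. where beta_1 = beta_2 in the notation of the question.
    import numpy as np
    from scipy.special import gammaln

    def ln_omega(N, q):
        """ln of the multiplicity of an Einstein solid with N oscillators and q quanta."""
        return gammaln(q + N) - gammaln(q + 1) - gammaln(N)

    N_1, N_2, q_tot = 300, 200, 1000           # toy sizes (arbitrary)
    q_1 = np.arange(1, q_tot)                  # possible quanta in system 1
    ln_omega_total = ln_omega(N_1, q_1) + ln_omega(N_2, q_tot - q_1)
    q_star = q_1[np.argmax(ln_omega_total)]    # most probable macrostate

    # finite-difference beta = d(ln Omega)/dq for each system at the most probable split
    beta_1 = ln_omega(N_1, q_star + 1) - ln_omega(N_1, q_star)
    beta_2 = ln_omega(N_2, q_tot - q_star + 1) - ln_omega(N_2, q_tot - q_star)
    print(q_star, beta_1, beta_2)              # beta_1 and beta_2 agree closely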

5 Answers

As discussed in the comments, your proof needs to show that: $$ \beta = \frac{1}{kT} $$ Following Gaskell's "Introduction to Thermodynamics", I think that this is a definition. The rationale comes from looking at $\beta$ as a parameter which controls the shape of the Boltzmann distribution of energy among particles: $$ n_{i}=\frac{ne^{-\beta E_i}}{P} $$ where $n$ is the total number of particles, $E_i$ is the $i^{th}$ energy level, $n_i$ is the occupation of the $i^{th}$ energy level, and $P$ is the partition function.

Having $\beta$ and $T$ inversely proportional makes sense because, as the plot of occupation vs. energy below shows, you would expect the higher energy states to become more occupied when the temperature is raised. This happens when $\beta$ is lowered.

[Figure: occupation number $n_i$ versus energy $E_i$ for several values of $\beta$.]
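(Editorial sketch, not the answerer's figure: a few lines of Python reproducing the behaviour described, with an arbitrary set of evenly spaced levels and an assumed particle number. Lowering $\beta$ shifts occupation toward the higher-energy levels.)

    # Sketch (assumed toy levels, not the answerer's data): Boltzmann occupations
    # n_i = n * exp(-beta * E_i) / Z for a "cold" and a "hot" value of beta.
    # Lower beta (higher T) puts more particles in the high-energy levels.
    import numpy as np

    E = np.arange(10)              # energy levels E_i = 0, 1, ..., 9 (arbitrary units)
    n_total = 1000.0               # assumed total particle number

    for beta in (1.0, 0.2):        # "cold" vs "hot"
        Z = np.sum(np.exp(-beta * E))              # partition function
        n_i = n_total * np.exp(-beta * E) / Z      # occupation of each level
        print(f"beta = {beta}:", np.round(n_i, 1))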

Answered by kdaquila on August 3, 2021

$\newcommand{\mean}[1] {\left< #1 \right>}$ $\DeclareMathOperator{\D}{d\!}$

Proof that $\beta = \frac{1}{k T}$ for the canonical ensemble

This proof assumes only classical thermodynamics and the Boltzmann distribution of microstates. It does not assume anything about statistical entropy.

First recall the statistical mechanics expressions for the pressure $P$ and internal energy $E$ of a system

\begin{align} P = \mean{P} = \frac{1}{\beta} \left( \frac{\partial \ln Z}{\partial V} \right)_{\beta, N} \end{align}

\begin{align} E = \mean{E} = -\left( \frac{\partial \ln Z}{\partial \beta} \right)_{V, N} \end{align}

where $Z$ is the partition function. These can both be derived from the Boltzmann distribution of microstates.
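(Editorial addition: both identities can be checked numerically. The sketch below uses an assumed toy system, a single particle in a 1D box with $E_n = n^2/L^2$ in reduced units and $L$ standing in for $V$, and compares finite-difference derivatives of $\ln Z$ with direct Boltzmann averages.)

    # Numerical check (editorial addition, toy model) of
    #   E = -(d ln Z / d beta)   and   P = (1/beta)(d ln Z / d V)
    # for one particle in a 1D box, E_n = n^2 / L^2 in reduced units (L plays the role of V).
    import numpy as np

    def ln_Z(beta, L, n_max=400):
        n = np.arange(1, n_max + 1)
        return np.log(np.sum(np.exp(-beta * n**2 / L**2)))

    beta, L, h = 0.5, 1.0, 1e-5

    # derivatives of ln Z by central finite differences
    E_from_lnZ = -(ln_Z(beta + h, L) - ln_Z(beta - h, L)) / (2 * h)
    P_from_lnZ = (ln_Z(beta, L + h) - ln_Z(beta, L - h)) / (2 * h) / beta

    # direct ensemble averages over the Boltzmann distribution
    n = np.arange(1, 401)
    E_n = n**2 / L**2
    p_n = np.exp(-beta * E_n) / np.sum(np.exp(-beta * E_n))
    print(E_from_lnZ, np.sum(p_n * E_n))            # the two energies agree
    print(P_from_lnZ, np.sum(p_n * 2 * E_n / L))    # P_n = -dE_n/dL = 2 E_n / L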

Let's take the partial derivative of the pressure with respect to $\beta$, holding $V$ and $N$ constant.

\begin{align}
\left( \frac{\partial P}{\partial \beta} \right)_{N} &= \frac{1}{\beta} \left( \frac{\partial^2 \ln Z}{\partial \beta ~\partial V} \right)_{N} - \frac{1}{\beta^2} \left( \frac{\partial \ln Z}{\partial V} \right)_{\beta, N} \\
&= \frac{1}{\beta} \left( \frac{\partial^2 \ln Z}{\partial \beta ~\partial V} \right)_{N} - \frac{1}{\beta} \mean{P}
\end{align}

Now we take the partial derivative of the energy with respect to volume $V$, holding $beta$ and $N$ constant.

\begin{align} \left( \frac{\partial E}{\partial V} \right)_{\beta, N} &= -\left( \frac{\partial^2 \ln Z}{\partial \beta ~\partial V} \right)_{N} \end{align}

Combining these two partial derivatives gives

\begin{align} -\mean{P} &= \left( \frac{\partial E}{\partial V} \right)_{\beta, N} + \beta \left( \frac{\partial P}{\partial \beta} \right)_{N,V} \end{align}

This equation, which was derived entirely from statistical mechanics assumptions, can be compared with a famous analogous equation from classical thermodynamics, the "thermodynamic equation of state":

\begin{align} -P &= \left( \frac{\partial E}{\partial V} \right)_{T, N} - T \left( \frac{\partial P}{\partial T} \right)_{N,V} \end{align}
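(For completeness, since the answer quotes this relation without proof: it follows from the first law $\D E = T \D S - P \D V$ and the Maxwell relation obtained from the Helmholtz free energy $A = E - TS$.)

\begin{align}
\left( \frac{\partial E}{\partial V} \right)_{T,N} &= T \left( \frac{\partial S}{\partial V} \right)_{T,N} - P \\
\left( \frac{\partial S}{\partial V} \right)_{T,N} &= \left( \frac{\partial P}{\partial T} \right)_{V,N} \qquad \textrm{(from $\D A = -S \D T - P \D V$)} \\
-P &= \left( \frac{\partial E}{\partial V} \right)_{T,N} - T \left( \frac{\partial P}{\partial T} \right)_{V,N}
\end{align}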

Because $\mean{P} = P$, we can combine the statistical and thermodynamic equations of state:

\begin{align}
\left( \frac{\partial E}{\partial V} \right)_{\beta, N} + \beta \left( \frac{\partial P}{\partial \beta} \right)_{N,V} &= \left( \frac{\partial E}{\partial V} \right)_{T, N} - T \left( \frac{\partial P}{\partial T} \right)_{N,V} \\
\left( \frac{\partial E}{\partial V} \right)_{\beta, N} + \left( \frac{\partial P}{\partial \ln \beta} \right)_{N,V} &= \left( \frac{\partial E}{\partial V} \right)_{T, N} - \left( \frac{\partial P}{\partial \ln T} \right)_{N,V} \qquad \textrm{because $\D \ln x = \frac{\D x}{x}$}
\end{align}

Since this equality must hold for any system, the two sides can match term by term only when

\begin{align} \D \ln \beta &= - \D \ln T \end{align}

Integrating both sides:

\begin{align}
\ln \beta &= - \ln T - \ln k \\
\beta &= \frac{1}{k T}
\end{align}

Because $\beta$ is independent of composition, the proportionality constant $k$ is a universal constant which is also independent of composition.
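(Editorial sanity check: the key statistical relation derived above, $-\mean{P} = \left(\frac{\partial E}{\partial V}\right)_{\beta,N} + \beta \left(\frac{\partial P}{\partial \beta}\right)_{N,V}$, can be verified numerically for an assumed toy partition function, here a single particle in a 1D box with $E_n = n^2/L^2$ in reduced units.)

    # Self-contained check (editorial addition, toy model) of the relation
    #   -<P> = (dE/dV)_beta + beta * (dP/dbeta)_V
    # derived above, for one particle in a 1D box with E_n = n^2 / L^2 (L plays the role of V).
    import numpy as np

    def ln_Z(beta, L, n_max=400):
        n = np.arange(1, n_max + 1)
        return np.log(np.sum(np.exp(-beta * n**2 / L**2)))

    def P(beta, L, h=1e-5):
        return (ln_Z(beta, L + h) - ln_Z(beta, L - h)) / (2 * h) / beta

    def E(beta, L, h=1e-5):
        return -(ln_Z(beta + h, L) - ln_Z(beta - h, L)) / (2 * h)

    beta, L, h = 0.5, 1.0, 1e-4
    dE_dV = (E(beta, L + h) - E(beta, L - h)) / (2 * h)
    dP_dbeta = (P(beta + h, L) - P(beta - h, L)) / (2 * h)
    print(-P(beta, L), dE_dV + beta * dP_dbeta)   # the two sides should match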

Answered by ratsalad on August 3, 2021

To go from statistical mechanics to thermodynamics, we assume that the quantity $\frac{\partial S}{\partial E}$ is equal to the inverse of the temperature. As for Boltzmann's relation, it can be verified by considering the case of coins.

Answered by user261139 on August 3, 2021

$\newcommand{\mean}[1] {\left< #1 \right>}$ $\DeclareMathOperator{\D}{d\!}$ $\DeclareMathOperator{\pr}{p}$

Proof that $\beta = \frac{1}{k T}$ and that $S = k \ln \Omega$

This proof follows from only classical thermodynamics and the microcanonical ensemble. It makes no assumptions about the analytic form of statistical entropy, and does not involve the ideal gas law.

First recall that the pressure of an individual microstate is given from mechanics as:

\begin{align} P_i &= -\frac{\D E_i}{\D V} \end{align}

Assuming only $P$-$V$ mechanical work, the energy of a microstate $E_i(N,V)$ depends on only two variables, $N$ and $V$; for example, consider a quantum mechanical system such as particles confined in a box. Therefore, at constant composition $N$,

\begin{align} P_i &= -\left( \frac{\partial E_i}{\partial V} \right)_N \end{align}
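(Concrete example, added for illustration: for a single particle in a cubical box of volume $V = L^3$, each microstate energy scales as $V^{-2/3}$, so every microstate has a well-defined pressure.)

\begin{align}
E_i = \frac{\pi^2 \hbar^2}{2 m}\left(n_x^2 + n_y^2 + n_z^2\right) V^{-2/3},
\qquad
P_i = -\left( \frac{\partial E_i}{\partial V} \right)_N = \frac{2}{3} \frac{E_i}{V}
\end{align}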

In a system described by the microcanonical ensemble, there are $\Omega$ possible microstates of the system. The energy of an individual microstate $E_i$ is trivially independent of the number of microstates $\Omega$ in the ensemble. Therefore, the pressure of an individual microstate can also be expressed as

\begin{align} P_i &= -\left( \frac{\partial E_i}{\partial V} \right)_{\Omega,N} \end{align}

According to statistical mechanics, the macroscopic pressure of a system is given by the statistical average of the pressures of the individual microstates:

\begin{align} P = \mean{P} &= \sum_i^\Omega \pr_i P_i \end{align}

where $\pr_i$ is the equilibrium probability of microstate $i$. For a microcanonical ensemble, all microstates have the same energy $E_i = E$, where $E$ is the energy of the system. Therefore, from the fundamental assumption of statistical mechanics, all microcanonical microstates have the same probability at equilibrium:

\begin{align} \pr_i = \frac{1}{\Omega} \end{align}

It follows that the pressure of a microcanonical system is given by

\begin{align}
P = \mean{P} &= -\sum_i^\Omega \frac{1}{\Omega} \left( \frac{\partial E_i}{\partial V} \right)_{\Omega,N} \\
&= -\frac{1}{\Omega} \sum_i^\Omega \left( \frac{\partial E}{\partial V} \right)_{\Omega,N} \\
&= -\frac{\Omega}{\Omega} \left( \frac{\partial E}{\partial V} \right)_{\Omega,N} \\
P &= -\left( \frac{\partial E}{\partial V} \right)_{\Omega,N}
\end{align}

This expression for the pressure of a microcanonical system can be compared to the classical expression

\begin{align} P &= -\left( \frac{\partial E}{\partial V} \right)_{S,N} \end{align}

which immediately suggests a functional relationship between the entropy $S$ and $\Omega$.

Now we take the total differential of the energy of a microcanonical system:

\begin{align} \D E = \left(\frac{\partial E}{\partial \ln \Omega}\right)_{V, N} \D \ln\Omega + \left(\frac{\partial E}{\partial V}\right)_{\ln \Omega, N} \D V \end{align}

As in the OP, $\beta$ for the microcanonical ensemble is defined by

\begin{align} \beta &= \left( \frac{\partial \ln \Omega}{\partial E} \right)_{V,N} \end{align}

Thus,

\begin{align} \D E = \frac{1}{\beta} \D \ln\Omega - P \D V \end{align}

Compare with the classical first law of thermodynamics:

\begin{align} \D E = T \D S - P \D V \end{align}

Because these two expressions for $\D E$ must agree, we see that

\begin{align}
T \D S &= \frac{1}{\beta} \D \ln\Omega \\
\D S &= \frac{1}{T \beta} \D \ln\Omega
\end{align}

Note that both $\D S$ and $\D \ln\Omega$ are exact differentials, so $\frac{1}{T \beta}$ must be either a constant or a function of $\Omega$. Since $S$ and $\ln\Omega$ are both extensive quantities, $\frac{1}{T \beta}$ cannot depend on $\Omega$, and therefore

\begin{align}
k &= \frac{1}{T \beta} \\
\beta &= \frac{1}{k T}
\end{align}

where $k$ is a universal constant that is independent of composition, since $\beta$ and $T$ are both independent of composition.

By integrating, we have

\begin{align} S &= k \ln\Omega + C \end{align}

where $C$ is a constant that is independent of $E$ and $V$, but may depend on $N$. By invoking the third law we can set $C=0$ to arrive at the famous Boltzmann expression for the entropy of a microcanonical system:

\begin{align} S &= k \ln\Omega \end{align}
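(Editorial sanity check of the extensivity property used in the argument above, with an Einstein solid of multiplicity $\Omega = \binom{q+N-1}{q}$ as an assumed example system: $\ln\Omega$ approximately doubles when $N$ and the number of energy quanta $q$ are both doubled.)

    # Sanity check (editorial addition): ln(Omega) is extensive for an Einstein solid,
    # as assumed in the exact-differential argument above. Doubling N and the number
    # of quanta q (i.e. doubling E at fixed energy per oscillator) roughly doubles ln(Omega).
    from math import lgamma

    def ln_omega(N, q):
        """ln of the Einstein-solid multiplicity C(q + N - 1, q)."""
        return lgamma(q + N) - lgamma(q + 1) - lgamma(N)

    N, q = 10_000, 25_000
    print(ln_omega(N, q))                            # ln Omega for (N, q)
    print(ln_omega(2 * N, 2 * q))                    # roughly twice as large
    print(ln_omega(2 * N, 2 * q) / ln_omega(N, q))   # ratio close to 2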

Answered by ratsalad on August 3, 2021

There is no well-defined "thermodynamic entropy" outside of the Shannon or von Neumann entropy, because there is no well-defined concept of temperature at all without entropy. Entropy is foundational, and temperature is derived from it. And entropy is fundamentally statistical in nature.

In fact, it is entropy that should have its own base unit, with temperature having a derived unit. If entropy had a unit, $B$, then temperature would have the unit $J/B$.

Answered by enigmaticPhysicist on August 3, 2021
