Physics Asked on March 27, 2021
Background
I was looking into the definition of temperature and how it relates to entropy and internal energy and I came across this answer on StackExchange.
According to the answer:
[Temperature is] the differential relationship between internal energy and entropy:
$$\begin{align}
dU &= T\,dS + \cdots \\
\frac{\partial S}{\partial U} &= \frac{1}{T}
\end{align}$$
As energy is added to a system, its internal entropy changes.
Equivalently, a low temperature means that a given change in the internal energy $U$ leads to a large change in the entropy $S$ of the system, while a high temperature means the same change in internal energy leads to only a small change in the entropy.
Now, entropy is essentially a measure of the number of states of a system (which for, say, a box of gas molecules would be the size of the set of possible positions and velocities of all the molecules). For a gas at low temperature, the Maxwell-Boltzmann velocity distribution is less spread out than the distribution for the same gas at high temperature: the low-temperature curve is tall and narrow, while the high-temperature curve is low and broad.
Since the entropy of a probability distribution measures how many states it covers, I'm assuming a more spread-out distribution has a higher entropy, because it spreads across a larger number of states.
Now, according to the answer's definition of temperature, this means that a given change in internal energy would change the spread (entropy) of the velocity distribution more for the low-temperature box of gas than the same change would for the high-temperature box.
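As a sanity check on the "more spread out means higher entropy" claim, here is a minimal numerical sketch. It uses a 1-D Gaussian as a stand-in for one velocity component of a Maxwell-Boltzmann gas (the variance $\sigma^2 = kT/m$ grows with temperature), and integrates $-\int p \ln p \, dx$ with a simple Riemann sum; the function name and grid parameters are illustrative choices, not from any library:

```python
import math

def gaussian_entropy_numeric(sigma, n=20001):
    """Differential entropy -integral(p ln p dx) of a 1-D Gaussian,
    approximated by a Riemann sum over [-10*sigma, 10*sigma]."""
    x_max = 10 * sigma
    dx = 2 * x_max / (n - 1)
    total = 0.0
    for i in range(n):
        x = -x_max + i * dx
        p = math.exp(-x * x / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))
        if p > 0:
            total -= p * math.log(p) * dx
    return total

# A hotter gas has a wider velocity distribution (larger sigma),
# and the wider distribution has the higher entropy.
cold = gaussian_entropy_numeric(1.0)
hot = gaussian_entropy_numeric(3.0)
print(cold, hot)  # hot > cold
```

The numerical values agree with the analytic differential entropy of a Gaussian, $\tfrac12\ln(2\pi e \sigma^2)$, which grows with $\sigma$ and hence with $T$.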
The Question
Am I interpreting correctly what a large versus a small $\partial S/\partial U$ (a small versus large $T$) represents in terms of the statistical distributions (in particular, is my interpretation of entropy correct)? If not, how can I think about the entropy of the Maxwell-Boltzmann distribution in terms of the graph of the distribution?
Whether or not I am correct, is there any intuition for why a low-temperature distribution's spread (entropy) would be more affected by a change in internal energy than a high-temperature distribution's?
To summarize I’m asking:
Why does it make sense that low temperatures correspond to a high $\partial S/\partial U$ and high temperatures correspond to a low $\partial S/\partial U$, and is there a way to think about this by looking at the shape of the Maxwell-Boltzmann distribution in the case of a box of gas?
Entropy is not actually a measure of the number of states. Imagine a die that only ever rolls $1$ or $2$. It has $6$ faces, sure, but the other $4$ are impossible, so how is this die any different from a coin? And if the probabilities of the other $4$ faces are nonzero but very small (say, $0.00001\%$), wouldn't the entropy still be very close to that of a coin?
To relate this to your question: at low $T$ you have the same number of states, but only very few are accessible. The entropy of a system with very few accessible states, like a coin, is small. Now consider a very high temperature. To model this, say the particles are distributed over a wide range of velocities and the distribution is fairly flat, so we can think of it as a uniform distribution over many possible states. A more spread-out distribution is indeed higher entropy. Going to an even higher temperature spreads it out only a little more, whereas going from almost $0$ to a slightly higher temperature spreads it out a lot.
To see this, look at the definition of entropy, $S = -\sum p \ln p$. For a uniform distribution, all the $p$ are identical and equal to $1/N$. Thus $S = -\ln(1/N) = \ln(N)$, so the derivative with respect to the number of states is $1/N$. When you have very few "accessible" states, the derivative is large, and when you have many, the derivative is small. The number of accessible states $N$ grows with energy, so this derivative is like the slope $\partial S/\partial U$, which corresponds to the inverse temperature $1/T$.
Correct answer by Danny Kong on March 27, 2021
Keep in mind that entropy is the logarithm of the number of states. Going from one possible state to ten results in the same increase in entropy as going from ten to a hundred. Assume, totally incorrectly, but just to illustrate the point, that increasing the energy by a fixed amount adds a fixed number of possible states. Then entropy would only increase with the logarithm of energy, so $\partial S/\partial U \sim 1/U$, and thus temperature would be proportional to $U$.
Generally, high energy means high temperature if the number of possible states (or the spread of the distribution, if you want) grows slower than exponentially with the energy.
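The toy model above (states added at a fixed rate per unit energy, so $N \propto U$ and $S = \ln N$) can be checked numerically. This is a sketch under that deliberately simplified assumption, with illustrative function names; it estimates $T = (\partial S/\partial U)^{-1}$ with a finite difference and confirms $T \propto U$:

```python
import math

def entropy(U, states_per_unit=1.0):
    # Toy model: number of accessible states grows linearly with
    # energy, N(U) = c*U, so S = ln(N) = ln(c*U).
    return math.log(states_per_unit * U)

def temperature(U, dU=1e-6):
    # T = (dS/dU)^(-1), estimated with a central finite difference.
    dSdU = (entropy(U + dU) - entropy(U - dU)) / (2 * dU)
    return 1.0 / dSdU

for U in (1.0, 2.0, 10.0):
    print(f"U={U:5.1f}  T={temperature(U):.4f}")  # T tracks U
```

With $S = \ln U$ the slope $\partial S/\partial U = 1/U$ is exact, so the finite-difference temperature comes out equal to $U$ up to numerical error.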
Answered by Vercassivelaunos on March 27, 2021