Physics Asked on December 31, 2020
As far as I can tell, temperature seems to be defined as something like the average kinetic energy per molecule, but not quite. It looks like it measures something proportional to this average kinetic energy, where the coefficient of proportionality depends on the number of independent degrees of freedom, by the equipartition principle.
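To state what I mean concretely, my understanding is that equipartition assigns $\tfrac{1}{2}k_B T$ on average to each quadratic degree of freedom, so a molecule with $f$ of them has $$\langle E \rangle = \frac{f}{2}\,k_B T.$$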
If I put a thermometer in some substance and let it reach equilibrium to measure the temperature, then I would naively expect that the average kinetic energy per molecule in the thermometer would equal the average kinetic energy per molecule in the substance. But if the thermometer and the substance have different numbers of degrees of freedom per molecule, that would mean their temperatures are actually different.
So I conclude that my naive understanding is wrong: it is the average kinetic energy per degree of freedom that must reach equilibrium, and if one object has more degrees of freedom per molecule it will end up with more energy per molecule. I can maybe convince myself of this intuitively by imagining a diatomic gas and a monatomic gas interacting.
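As a rough check of this picture (my own numbers, taking $f=3$ for a monatomic gas such as helium and $f=5$ for a diatomic gas such as nitrogen near room temperature, where vibration is frozen out):

```python
# Toy comparison: monatomic vs diatomic ideal gas at the same temperature.
# Assumes equipartition with f = 3 (translation only) for helium and
# f = 5 (translation + two rotations) for nitrogen near room temperature.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # common equilibrium temperature, K

for name, f in [("helium (f=3)", 3), ("nitrogen (f=5)", 5)]:
    per_dof = 0.5 * k_B * T      # average energy per degree of freedom
    per_molecule = f * per_dof   # average thermal energy per molecule
    print(f"{name}: {per_dof:.3e} J per dof, {per_molecule:.3e} J per molecule")

# Same temperature and the same 1/2 k_B T per degree of freedom,
# but the diatomic gas ends up with more energy per molecule.
```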
But then since the thermometer measures something that does reach equilibrium, I suppose it must be directly measuring the average kinetic energy per degree of freedom. Is that accurate?
And if it is, how does it do that? A mercury thermometer measures temperature by thermal expansion. This seems to suggest that thermal expansion is governed by a fixed set of degrees of freedom. I would guess that it would be the translational kinetic energy (more specifically, since the mercury expands in only one dimension, I would say the translational kinetic energy in that direction). Is this an accurate description of the function of a thermometer?
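As a toy model of what I imagine the thermometer doing (my own sketch, with an illustrative bulb volume and capillary cross-section, a volumetric expansion coefficient for mercury of roughly $1.8\times10^{-4}\ \mathrm{K^{-1}}$, and the expansion of the glass neglected), the reading is really just a calibrated column length:

```python
# Toy mercury-in-glass thermometer: the reading is a calibrated column length.
# Assumptions: mercury volumetric expansion ~1.8e-4 per K, glass expansion
# neglected, all extra volume pushed into the capillary.
beta = 1.8e-4        # 1/K, approximate volumetric expansion of mercury
V_bulb = 200e-9      # m^3, bulb volume (0.2 mL, illustrative)
A_cap = 2e-8         # m^2, capillary cross-section (illustrative)

def column_rise(dT):
    """Rise of the mercury column (m) for a temperature change dT (K)."""
    return beta * V_bulb * dT / A_cap

# Calibrate: mark 0 at the ice point and 100 at the steam point,
# then any intermediate column length maps linearly onto a temperature.
print(column_rise(100.0) * 1000, "mm between the 0 and 100 marks")
```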
Maybe it will help to read about the history of thermometers and how temperature readings developed:
Temperature is a numerical representation of hot or cold compared against baselines, typically the point at which water freezes and boils.
....
The concept of measuring temperature is fairly new. The thermoscope — essentially a thermometer without a scale — was the precursor to the modern thermometer. There were several inventors working on thermoscopes around 1593,
...
Ferdinand II, the Grand Duke of Tuscany, followed in 1654, inventing the first enclosed thermometer, using alcohol as a liquid. But it still lacked a standardized scale and was not very accurate.
Around the same time, German physicist Daniel Gabriel Fahrenheit met Olaus Roemer, a Danish astronomer, who developed an alcohol-based thermometer using wine. He marked two points on his thermometer — 60 to mark the temperature of boiling water and 7.5 as the point where ice melted.
So until the statistical mechanics model of the ideal gas, there was no connection between temperature and the kinetic energy of individual particles.
The connection with kinetic energy helps in understanding the concept of thermal equilibrium intuitively, and this link may help, but the connection happened long after the invention of thermometers.
You state:
I suppose it must be directly measuring the average kinetic energy per degree of freedom. Is that accurate?
I do not think so. Kinetic theory allows us, once we know the temperature and the material, to calculate the average kinetic energy. Otherwise, the thermometer simply measures the temperature on a scale set by the freezing and boiling points of water.
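For example, for a simple monatomic ideal gas, once the temperature is known the average translational kinetic energy follows as $$\langle E_{\text{trans}}\rangle = \tfrac{3}{2}k_B T \approx \tfrac{3}{2}\,(1.38\times10^{-23}\,\mathrm{J/K})(373\,\mathrm{K}) \approx 7.7\times10^{-21}\,\mathrm{J}$$ at the boiling point of water; the thermometer itself only reports the temperature.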
Answered by anna v on December 31, 2020
As far as I can tell temperature seems to be defined as something like average kinetic energy per molecule
No. A glass of water with some ice cubes, in approximate thermal equilibrium in a refrigerator, has a temperature of $0^{\circ}\mathrm{C}$. Both the ice and the liquid water have that temperature. But the average thermal energy of the molecules of the liquid water is much greater.
Phase changes are the best examples that energy delivered to a system (which increases its internal energy) is not always accompanied by a temperature increase.
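A rough numerical sketch (my own illustration, using approximate textbook values of about $334\,\mathrm{kJ/kg}$ for the latent heat of fusion and about $4.19\,\mathrm{kJ/(kg\,K)}$ for the specific heat of liquid water) shows how much energy goes in with no temperature change at all:

```python
# Rough comparison: energy absorbed during a phase change vs. ordinary heating.
# Approximate values: latent heat of fusion ~334 kJ/kg,
# specific heat of liquid water ~4186 J/(kg K).
L_fusion = 334e3   # J/kg
c_water = 4186.0   # J/(kg K)
m = 1.0            # kg of ice at 0 degrees C

Q_melt = m * L_fusion                   # melts the ice; temperature stays at 0 C
dT_equivalent = Q_melt / (m * c_water)  # same energy used to heat liquid water

print(f"Melting 1 kg of ice absorbs {Q_melt/1e3:.0f} kJ at constant temperature.")
print(f"The same energy would heat 1 kg of liquid water by {dT_equivalent:.0f} K.")
```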
Of course temperature could be redefined to match the internal energy of a system, but that would be totally different from our intuitive concept of temperature.
The thermodynamic concept of temperature as $$T = \frac{\partial E}{\partial S},$$ where $S$ is the entropy, explains that relation better.
When there is an energy input to a system and there is no phase change, the temperature increases: that derivative grows as energy is added.
During the phase change, internal energy keeps increasing, but the derivative (temperature) is constant. The increase of energy is linearly proportional to the increase of entropy in this case.
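For instance, melting a mass $m$ of ice at $T_m \approx 273\,\mathrm{K}$ requires the latent heat $Q = mL$, and since the temperature is constant the entropy simply grows by $$\Delta S = \frac{Q}{T_m} = \frac{mL}{T_m},$$ so on a plot of $E$ versus $S$ the phase change is a straight segment with constant slope $T_m$.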
Answered by Claudio Saspinski on December 31, 2020