
What is a fundamental force behind entropy?

Physics Asked by acrc on March 17, 2021

There are four fundamental forces in the universe. They are behind every interaction in nature. For instance, the electromagnetic force is behind friction. I wonder which fundamental force is behind entropy.

2 Answers

Small-scale example

Toss four coins. Each lands as either heads $\mathrm H$ or tails $\mathrm T$. The possible outcomes are:

$$\begin{aligned}
\text{All heads:}\quad & \mathrm{HHHH}\\
\text{Three heads:}\quad & \mathrm{HHHT},\ \mathrm{HHTH},\ \mathrm{HTHH},\ \mathrm{THHH}\\
\text{Two heads:}\quad & \mathrm{HHTT},\ \mathrm{HTHT},\ \mathrm{HTTH},\ \mathrm{THHT},\ \mathrm{THTH},\ \mathrm{TTHH}\\
\text{One head:}\quad & \mathrm{HTTT},\ \mathrm{THTT},\ \mathrm{TTHT},\ \mathrm{TTTH}\\
\text{No heads:}\quad & \mathrm{TTTT}
\end{aligned}$$

Each sequence (such as $\mathrm{THTT}$ or $\mathrm{THHH}$) is what we call a microstate. Each overall outcome (such as All heads or Two heads) is called a macrostate.

Since all microstates are equally likely, the most probable macrostate is clearly Two heads.
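This counting is easy to check programmatically. Here is a minimal Python sketch (my own illustration, not part of the original answer) that enumerates all $2^4$ microstates and groups them into macrostates by the number of heads:

```python
from itertools import product
from collections import Counter

# All 2^4 = 16 microstates of four coin tosses, e.g. "HHTH".
microstates = ["".join(p) for p in product("HT", repeat=4)]

# Macrostate = number of heads; count how many microstates realise each one.
macrostates = Counter(s.count("H") for s in microstates)

# Two heads is realised by 6 of the 16 microstates - the most probable macrostate.
print(macrostates.most_common())  # most probable macrostate listed first
```

Since every microstate has probability $1/16$, the macrostate with the most microstates (Two heads, with 6) is automatically the most probable.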

But at the same time the Two-heads outcome is the most chaotic one: we know the least about the underlying microstate, because it could be any of six possibilities. With All heads we know exactly which microstate lies underneath - there is no chaos there. All in all, the most probable outcome is the one we know the least about - the most probable outcome is the most chaotic outcome.

No particular "force" caused this to be. Nevertheless this is reality. Most likely you will see Two heads as the outcome from a random four-coin toss; most likely you will see maximised chaos appearing.

This is how to consider entropy. Entropy is a measure invented to quantify this behaviour of our world - it is a "chaos factor". If you tossed a million sets of four coins, you would mostly get Two heads because that is the most probable macrostate, the one with the most microstates - you would see chaos being maximised. Entropy is calculated directly from the number of microstates $z$ of a macrostate (assuming each is equally likely) like this:

$$S = k_B \ln(z)$$

where $k_B$ is Boltzmann's constant. The problem in practice is that we usually can't know the exact number of microstates of a real system. But we can still know other things about the entropy - for instance, that it always tends to increase for an isolated system.
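As a small illustration of the formula (my own sketch, applied to the coin macrostates above), this snippet evaluates $S = k_B \ln(z)$ using the exact SI value of the Boltzmann constant:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(z: int) -> float:
    """Entropy S = k_B * ln(z) of a macrostate with z equally likely microstates."""
    return k_B * math.log(z)

# "All heads" has a single microstate, so its entropy is zero - no chaos.
s_all_heads = boltzmann_entropy(1)

# "Two heads" has six microstates, so it carries the highest entropy.
s_two_heads = boltzmann_entropy(6)
```

More microstates means higher entropy, so the most probable macrostate is also the highest-entropy one.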

Large-scale example

For a better real-life example, scaling up from a few coins to countless molecules: if you empty a glass of perfume in the corner of a room, you initially have a high concentration in that corner and little or none in the rest of the room. Pick a molecule at random from the corner and you can expect it to be perfume; pick one from the rest of the room and you can expect it not to be. You know a lot about the air. No chaos; not a chaotic system.

But over time the perfume will eventually disperse and spread to all parts of the room. If you then pick a random molecule you are less sure of what it is - you know less about this system; more chaos. The expected end-state is of course maximised mixing - which corresponds to maximised chaos (we know the least about the system in this state).

Calculate the entropy of this system and you will see it start low when chaos is low and end high when full mixing and maximum chaos have been achieved. This is what entropy is: a measure of chaos. And any system will overall naturally move towards more chaos (more mixing; the most probable macrostate), not towards less (the probability of the perfume molecules spontaneously gathering back into one concentrated area is very, very low).
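To make the trend concrete, here is a toy simulation (my own sketch with made-up numbers, not a physical model of perfume): molecules start in one "corner" cell of a one-dimensional room, random-walk between cells, and the Shannon entropy of their cell distribution (in nats, proportional to the thermodynamic mixing entropy) rises from zero towards its maximum of $\ln(\text{number of cells})$:

```python
import math
import random

random.seed(0)

N_MOLECULES = 1000  # toy number of "perfume" molecules
N_CELLS = 10        # the room, divided into a line of cells; corner is cell 0

def mixing_entropy(positions):
    """Shannon entropy (nats) of the molecules' cell distribution.
    Zero when all molecules share one cell; maximal (ln N_CELLS) when uniform."""
    counts = [0] * N_CELLS
    for p in positions:
        counts[p] += 1
    n = len(positions)
    return -sum((c / n) * math.log(c / n) for c in counts if c)

positions = [0] * N_MOLECULES      # all perfume starts in the corner
start_S = mixing_entropy(positions)  # 0.0: we know exactly where everything is

# Diffusion as a random walk: each step, every molecule hops to a neighbour
# cell (reflecting at the walls by clamping to the valid range).
for _ in range(2000):
    for i in range(N_MOLECULES):
        positions[i] = min(N_CELLS - 1, max(0, positions[i] + random.choice((-1, 1))))

end_S = mixing_entropy(positions)  # approaches ln(10) ~ 2.30 as mixing maximises
```

The entropy climbs simply because there are vastly more well-mixed microstates than concentrated ones; no force pushes the molecules apart.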

This is what we call the second law of thermodynamics: systems tend towards higher entropy, more chaos. There is no physical law preventing the opposite, but probability considerations and observation tell us that this must be so. And no particular force or interaction causes it. In the four-coins example you could point to forces from hand and table, from air drag and friction. In the perfume example you could point to diffusion and similar processes. There is no specific force or interaction connected to entropy; it varies from case to case.

Answered by Steeven on March 17, 2021

There are two possible meanings of entropic force. One is quite generic and refers to the generic concept of entropy as causing phenomena, the other is more technical and refers to some phenomena observable in general in asymmetric solutions.

I will not touch the former concept. It is too generic and would open the door to introducing many other "forces" of this kind. In thermodynamics, the maximum-entropy principle is valid only for an isolated system, while other extremum principles are the correct ones for other boundary conditions; we would then have to introduce a different new "force" for each set of boundary conditions applied to the system. I'll confine my answer to the more specific meaning of entropic force in condensed matter.

Even though people speak about entropic force, as clearly stated at the beginning of the corresponding entry in Wikipedia, it is an emergent phenomenon not to be confused with the fundamental forces.

By fundamental forces in physics, we mean the forces, or, better, the interactions, acting between what we consider the elementary building blocks of the real world. Entropic forces in condensed matter appear only as an effect of the presence of many molecules, and the name "entropic force" refers to the effect rather than the cause of the interactions.

Answered by GiorgioP on March 17, 2021
