Is a spontaneous decrease in entropy *impossible* or just extremely unlikely?

Physics Asked by Andy4983948 on September 28, 2020

I was reading this article by Ethan Siegel and I have some doubts about a sentence on entropy, specifically where Ethan explains the irreversibility of the hot-and-cold room, as in this figure:

[Figure: a room divided into a hot half and a cold half, and the same room after the divider is removed and the gas has mixed to a uniform temperature.]

In his words:

It’s like taking a room with a divider down the middle, where one side is hot and the other is cold, removing the divider, and watching the gas molecules fly around. In the absence of any other inputs, the two halves of the room will mix and equilibrate, reaching the same temperature. No matter what you did to those particles, including reverse all of their momenta, they’d never reach the half-hot and half-cold state ever again.

My question is:

Is the spontaneous evolution from the equilibrium temperature (right side of the image) to the half-hot and half-cold state (left side) physically and theoretically impossible/forbidden, or is it simply so astronomically unlikely (from a statistical perspective) that in reality it never happens? The article seems to suggest the former, but I was under the impression that it is the latter.

9 Answers

The appropriate mathematical tool for understanding this kind of question, and in particular Dale's and buddy001's answers, is large deviation theory. To quote Wikipedia, "large deviations theory concerns itself with the exponential decline of the probability measures of certain kinds of extreme or tail events". In this context, "exponential decline" means a probability that decreases exponentially fast as the number of particles increases.

TL;DR: using a statistical mechanics of "trajectories" built on large deviation theory, one can show that the probability of observing an evolution path along which the entropy decreases is non-zero, but that it falls off exponentially fast with the number of particles.

Equilibrium statistics

In equilibrium statistical mechanics, working in the appropriate thermodynamic ensemble (here, for instance, the microcanonical ensemble), one can relate the probability of observing a macrostate $M_N$ of the $N$ particles in the system to the entropy of that macrostate, $S[M_N]$:
$$\mathbf{P}_{eq}\left(M_N\right)\propto\mathrm{e}^{N\frac{\mathcal{S}[M_N]}{k_{B}T}}.$$
Naturally, the most probable macrostate is the equilibrium state, the one that maximizes the entropy, and the probability of observing any other macrostate decreases exponentially fast as the number of particles goes to infinity. This is why we can see it as a large deviation result in the large-particle-number limit.

Dynamical fluctuations

Using large deviation theory, we can extend this equilibrium point of view, based on the statistics of macrostates, to a dynamical perspective based on the statistics of trajectories. Let me explain.

In your case, you would expect the macrostate of the system, $(M_N(t))_{0\leq t\leq T}$, to evolve over a time interval $[0,T]$ from an initial configuration $M_N(0)$ with entropy $S_0$ to a final configuration $M_N(T)$ with entropy $S_T$ such that $S_0 \leq S_T$, where $S_T$ is the maximal entropy characterizing the equilibrium distribution and the entropy $S_t$ of the macrostate at time $t$ is a monotonically increasing function (as in the H-theorem for the kinetic theory of a dilute gas, for instance).

However, as long as the number of particles is finite (even if it is very large), it is possible to observe different evolutions, particularly if you wait for a very long time, assuming for instance that your system is ergodic. By long, I mean long with respect to the number of particles. In particular, it has recently been established that one can formulate a dynamical large deviation result that characterizes the probability of any evolution path of the macrostate of the system (https://arxiv.org/abs/2002.10398). This result makes it possible to evaluate, for a large but finite number of particles, the probability of observing any evolution path $(M_N(t))_{0\leq t\leq T}$, including paths along which the entropy $S_t$ of the system at time $t$ is non-monotonic. This probability becomes exponentially small as the number of particles grows, and the most probable evolution, the one that increases entropy, acquires an exponentially overwhelming probability as the number of particles goes to infinity.

Obviously, for a classical gas $N$ is so large that such entropy-decreasing evolution paths will never be observed: you would have to wait longer than the age of the universe to see your system do this. But one can imagine systems where we still use statistical mechanics, yet $N$ is large but not large enough to "erase" dynamical fluctuations, biological or astrophysical systems for instance, in which it becomes crucial to quantify fluctuations away from the entropic fate.
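
To see the flavor of these dynamical fluctuations without any heavy machinery, here is a minimal illustrative sketch (a toy Ehrenfest urn model, not the formalism of the paper above; the 60/40 threshold, sweep count and particle numbers are arbitrary choices): it estimates how often a well-mixed "gas" of $N$ particles spontaneously fluctuates into a noticeably unmixed, lower-entropy macrostate.

```python
import random

def prob_large_fluctuation(n_particles, imbalance=0.6, n_sweeps=10,
                           n_trials=500, seed=1):
    """Ehrenfest urn model: n_particles hop one at a time between the two
    halves of a box. Starting from the well-mixed state (half on each side),
    estimate the probability that, within n_sweeps * n_particles hops, the
    system spontaneously reaches a lower-entropy macrostate in which at
    least `imbalance` of the particles sit on one side."""
    rng = random.Random(seed)
    threshold = int(imbalance * n_particles)
    hits = 0
    for _ in range(n_trials):
        n_left = n_particles // 2
        for _ in range(n_sweeps * n_particles):
            # pick a particle uniformly at random; it moves to the other side
            if rng.random() < n_left / n_particles:
                n_left -= 1
            else:
                n_left += 1
            if n_left >= threshold or n_left <= n_particles - threshold:
                hits += 1
                break
    return hits / n_trials

for n in (20, 100, 400):
    print(f"N = {n:3d}: P(60/40 split within 10 sweeps) ~ {prob_large_fluctuation(n):.3f}")
```

With a handful of particles such fluctuations are routine; already at a few hundred particles they essentially never happen on this time scale, which is the exponential suppression described above.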

Correct answer by ErgodicRoller on September 28, 2020

Well, there is a thought experiment by Maxwell (known as Maxwell's demon) in which, if one knew the exact state of all the particles in both compartments, one could open and close the partition at just the right moments so as to collect the high-energy particles on one side and leave the low-energy particles on the other. Acquiring exact information about every particle is next to impossible, and even if one could do it, the resulting separation would not be spontaneous.

Now, talking about the probability of such an event happening: imagine you flip a coin 10000 times. What do you expect for the result, i.e. the number of tails versus the number of heads? As the law of large numbers states, it will be close to 50-50, so it is highly unlikely that you will get 9999 heads and one tail.

Returning to your question: there are on the order of $10^{24}$ molecules in just one mole of gas, and with that many molecules, separating them would require particles of only one kind to pass through the partition. You can imagine how unlikely that is when you cannot even get 9999 tails from 10000 flips. (The coin experiment is just an analogy: think of a tail as a high-energy particle and a head as a low-energy particle, or vice versa, going through the partition. I have also assumed that no collisions occur that would change the velocities, which is itself impossible.)

So yes it is astronomically unlikely.
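
To put rough numbers on the coin analogy, here is a short illustrative sketch that evaluates the binomial probabilities in log space (the particle numbers in the last line are assumptions chosen for illustration, not taken from the answer):

```python
from math import comb, lgamma, log

def log10_prob_k_heads(n_flips, k):
    """log10 of the probability of getting exactly k heads in n fair coin
    flips, computed with log-gamma so it never underflows."""
    log_comb = lgamma(n_flips + 1) - lgamma(k + 1) - lgamma(n_flips - k + 1)
    return (log_comb - n_flips * log(2)) / log(10)

# The coin analogy above: 9999 heads and 1 tail out of 10000 flips.
print(comb(10_000, 9_999) / 2 ** 10_000)   # underflows to 0.0; true value ~ 1e-3006
print(log10_prob_k_heads(10_000, 9_999))   # about -3006

# Even a modest 1% excess of "hot" particles on one side of a box with a
# (deliberately small) N = 10^6 molecules has probability around 10^-90.
print(log10_prob_k_heads(10**6, 10**6 // 2 + 10**4))
```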

Answered by buddy001 on September 28, 2020

What you are interested in is the Crooks fluctuation theorem. It gives the probability of going "backwards" thermodynamically. Specifically, the theorem says:

$$\frac{P(A\rightarrow B)}{P(A\leftarrow B)}=\exp\left(\frac{1}{k_B T}\left(W_{A\rightarrow B}-\Delta F\right)\right)$$

In the case of the box, $W_{A\rightarrow B}=0$, so the probability ratio is driven purely by the change in Helmholtz free energy, $\Delta F$.
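
As a quick numerical illustration of how lopsided this ratio becomes (assumed toy numbers, treating the removal of the divider as an ideal-gas free expansion in which the free energy drops by $N k_B T \ln 2$):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def crooks_ratio(work_forward, delta_f, temperature):
    """P(A -> B) / P(A <- B) from the Crooks fluctuation theorem."""
    return math.exp((work_forward - delta_f) / (K_B * temperature))

# Removing the divider: no work is done (W = 0), and the Helmholtz free
# energy drops by N * k_B * T * ln(2) when N molecules double their
# available volume at fixed temperature (ideal-gas free expansion).
T = 300.0  # kelvin
for n_molecules in (1, 10, 100):
    delta_f = -n_molecules * K_B * T * math.log(2)
    print(n_molecules, crooks_ratio(0.0, delta_f, T))   # equals 2**N
```

Already for 100 molecules the mixed direction is favored by a factor of $2^{100}\approx 10^{30}$; for a mole of gas the exponent itself is of order $10^{23}$.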

Answered by Dale on September 28, 2020

Noting that the Shannon information entropy $H$ is related to the thermodynamic entropy like this:

$$ S = k_B H $$

One can express the quantum entropic uncertainty principle for thermodynamic entropies:

$$ S_a + S_b \geq k_B \log\left(\frac{e}{2}\right) $$

where $S_a, S_b$ are the temporal and spectral thermodynamic entropies. This shows that entropies can fluctuate in time and in the spectral domain. It is not forbidden for an entropy fluctuation to go backwards, but most likely this happens on short time scales and within small partitions of the whole system, and backwards fluctuations will probably be cancelled later by fluctuations along the standard arrow of time. So not much useful information can be extracted from backwards fluctuations, because in principle they are uncontrollable.

Also, Bohr suggested a thermodynamic uncertainty relation: $$ \Delta\beta \ge \frac{1}{\Delta U} $$

where $\beta = (k_B T)^{-1}$ is the inverse temperature. This relation means that if you know the system's internal energy very precisely, then you know nothing about its temperature, and vice versa. Now imagine that, after the molecules have diffused, you measure the temperature of part A exactly and the internal energy of part B exactly. Then, according to the uncertainty principle, this measurement could result in the formation of a half-hot/half-cold partition of molecules. But this implies that the measurement has performed some kind of thermodynamic work, so it has nothing to do with a spontaneous backwards entropy change and thus falls outside the question formulated by the OP. Still, I think it is interesting to consider this kind of possibility, because the act of measurement is vaguely defined and may happen without human intervention.

Answered by Agnius Vasiliauskas on September 28, 2020

Entropy is a measure of how spread out energy is compared to the maximum amount it could be spread out. The mathematics shows that the predicted increase in the entropy of the universe (the second law of thermodynamics) is a result of the statistical probability that energy will tend toward a more spread-out (rather than concentrated) state.

Although this process seems irreversible, by the same probability-based reasoning it is also statistically inevitable, over a long enough time span, that the energy of the universe will redistribute into an extremely low-entropy (highly concentrated) configuration. This probability is so low that it is almost impossible to describe, except to say that it is not infinitely unlikely, and therefore it will eventually occur.

Interestingly, one of the greatest living physicists, Roger Penrose, has argued that there is a huge mystery in cosmology related to entropy, namely that there is no explanation for how the initial very-low-entropy state of the universe could have occurred.

Answered by John Fletcher on September 28, 2020

Is the spontaneous evolution from the equilibrium temperature (right side of the image) to the half-hot and half-cold state (left side) physically and theoretically impossible/forbidden,

No.

or is it simply so astronomically unlikely (from a statistical perspective) that in reality it never happens?

Yes.

I'll extend my terse answer, but I don't want to go on too long because, frankly, I don't think a long answer is needed for this question. I do not understand why physicists wring their hands so badly about this. Start with the atoms as they are in the picture on the left and remove the divider. Let the system evolve for 10 minutes. By our usual definition of entropy (related to the number of red and blue particles on each side), the system will then have essentially maximal entropy. Take a snapshot of the exact position and momentum of each particle.

Now, start over with the exact same number of particles. Place them in exactly those positions, and at the start of the experiment give each particle the reverse of the momentum it had at the end of the previous experiment. Newton's laws are reversible. This means the particles WILL go back to the configuration with all red on one side and all blue on the other side.

There should be absolutely nothing controversial about this. The initial state I described for the second experiment is a perfectly valid state in configuration space. Theoretically I am allowed to specify ANY positions and momenta I like for all the particles. Newton's laws are reversible. Period. This explains my "No." answer to the OP's first question.
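
Here is a minimal illustrative sketch of exactly this experiment, under the simplifying assumption of non-interacting point particles bouncing in a 1-D box (elastic collisions between equal masses would be just as reversible). Run the gas forward so it spreads over the whole box, flip every velocity, run the same dynamics for the same time, and the particles march back into the left half:

```python
import random

def simulate(positions, velocities, n_steps, dt=1e-3, box=1.0):
    """Free particles in a 1-D box with perfectly reflecting walls.
    Returns the final positions and velocities (the inputs are copied)."""
    xs, vs = list(positions), list(velocities)
    for _ in range(n_steps):
        for i in range(len(xs)):
            xs[i] += vs[i] * dt
            # elastic bounce off either wall
            if xs[i] < 0.0:
                xs[i] = -xs[i]
                vs[i] = -vs[i]
            elif xs[i] > box:
                xs[i] = 2.0 * box - xs[i]
                vs[i] = -vs[i]
    return xs, vs

rng = random.Random(42)
n = 50
x0 = [rng.random() * 0.5 for _ in range(n)]   # start: everyone in the left half
v0 = [rng.gauss(0.0, 1.0) for _ in range(n)]

# Let the "gas" spread out over the whole box...
x1, v1 = simulate(x0, v0, n_steps=5000)

# ...then flip every momentum and run the same dynamics for the same time.
x2, v2 = simulate(x1, [-v for v in v1], n_steps=5000)

# Maximum deviation from the initial positions: round-off level, i.e. the
# particles are back in the left half where they started.
print(max(abs(a - b) for a, b in zip(x2, x0)))
```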

So that is the theoretical part of the answer. Now for the practical part: why don't we ever see this happen? That has been answered in many words by all of the other answers here. The reason is that it is unbelievably unlikely. Calling it astronomically unlikely GREATLY overstates the magnitude of astronomical scales. This explains the "Yes." answer to the OP's second question.

Now a little bonus that my answer hasn't addressed yet. One way to think about the second law of thermodynamics is this: the entropy of a state tells you how statistically probable it is to find the system in that state. The second law says that, over time, it is HIGHLY likely that, compared with the state the system is in now, the state it is in later will be one that is more statistically probable to find the system in. Put more sharply: "We are more likely to find a system in states in which we are more likely to find a system."

Answered by jgerber on September 28, 2020

While nothing's been proven, current theories posit that a black hole's entropy changes in inverse proportion to its mass/energy: i.e., when it decays, its entropy increases. Most black holes spend most of their early lifespans increasing in mass, and they would be decreasing in entropy during this time.

Now this isn't a net loss of entropy: the energy black holes release as they rip apart matter and, more than likely, spacetime leads to the inevitable net increase in entropy that our favorite law of thermodynamics requires.

In the context of just the black hole and the matter it's vacuuming up: yes, entropy spontaneously decreases. But unless our entire universe were to be contained in a black hole, even these cosmological titans still produce a net increase of entropy.

Answered by ILikeCommas on September 28, 2020

Poincare recurrence has been mentioned in a comment by tusky_mcmammoth, but I think it's worth highlighting as an answer to illustrate both a piece of interesting mathematics and a limit of mathematical modeling.

A mathematical model of "particles in a box" treats the particles as points that collide elastically with each other and with the container. Because the particles are confined and energy is conserved, the Poincare recurrence theorem actually guarantees that the system will eventually return arbitrarily close to its initial conditions!

Of course, in reality the universe will freeze to death first. The time this takes is enormous. (For example, this paper numerically computes Poincare recurrence times for completely integrable systems using some tricks from number theory.)
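
As a toy illustration of why these times explode (an assumed model of a few uncoupled rotors with mutually incommensurate frequencies, not the systems studied in the paper), ask when all of the phases simultaneously return close to their starting values. Each extra degree of freedom multiplies the wait by roughly a constant factor, so the recurrence time grows exponentially with the number of degrees of freedom:

```python
import math

def recurrence_time(frequencies, tol=0.1, dt=0.01, max_steps=2_000_000):
    """Crude estimate of a Poincare recurrence time for uncoupled rotors:
    the first time t > 0 at which every phase f * t (mod 2*pi) has come
    back to within `tol` radians of its starting value 0."""
    # skip the first few steps, before the slowest rotor has left the
    # tol-neighbourhood of its starting phase
    warmup = int(2.0 * tol / (min(frequencies) * dt)) + 1
    for step in range(warmup, max_steps + 1):
        t = step * dt
        if all(abs((f * t + math.pi) % (2.0 * math.pi) - math.pi) < tol
               for f in frequencies):
            return t
    return math.inf  # did not recur within the step budget

# Mutually incommensurate frequencies (square roots of primes).
freqs = [math.sqrt(p) for p in (2, 3, 5)]

# Each extra rotor multiplies the waiting time by roughly pi / tol (~30
# here), so for ~10^23 degrees of freedom the recurrence time dwarfs the
# age of the universe.
for d in (1, 2, 3):
    print(d, recurrence_time(freqs[:d]))
```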

One could paraphrase the story of the butterfly and the diamond mountain to say:

There is a diamond mountain. Once every thousand years, a butterfly visits it and touches it once. By the time the butterfly has worn the mountain down to nothing, a complex system's Poincare recurrence time has just begun to elapse.

Answered by Neal on September 28, 2020

Decreasing entropy seems impossible, not just improbable.

Here's why:

  1. I am using the formula $\Omega = \frac{N!}{j!\,(N-j)!}$, where $\Omega$ is the number of microstates (and hence the relative probability) of having $j$ and $(N-j)$ particles in the two halves of an isolated system (a box), $N$ is the number of particles in the system, $j$ is the number of particles in one half, and $(N-j)$ is the number in the other half. With this formula, I get the probability of a 2% fluctuation, $\frac{N/2 - j}{N/2} = 0.02$, to be 98% for N = 200, 10% for N = 1,000, 0.1% for N = 100,000, and one in 10^21 for N = 1,000,000 (a log-space sketch of this kind of calculation appears after this list). Therefore, no decrease in entropy will ever be observed in a macroscopic system.
  2. Given ~10^25 molecules in a cubic meter, isolated microsystems of 1000 molecules are NOT realistic. Why assume they would behave like their macrosystem equivalents when just about anything has different properties at such a small comparative scale? Let us NOT assume that 1000 molecules behave just like 10^25 molecules.
  3. The arrow of time is given by entropy increase (think of a movie of an egg breaking). Since no one has ever witnessed a reversal of the arrow of time, why assume such a thing?
  4. The model used for systems of particles is demonstrably false. Pebbles on a Go board are NOT how gases behave: pebbles stay put, whereas compressed gases push back forcefully.
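
A log-space sketch of the kind of calculation in point 1 (note that the exact numbers depend on how a "2% fluctuation" is counted, as a single macrostate or as a tail probability, so they need not match the figures quoted above, but the collapse with N is the same):

```python
from math import lgamma, exp

def log_omega(n_total, j):
    """Natural log of Omega = N! / (j! (N-j)!), the number of microstates
    with j particles in one half of the box."""
    return lgamma(n_total + 1) - lgamma(j + 1) - lgamma(n_total - j + 1)

def relative_weight(n_total, fluctuation=0.02):
    """Weight of the macrostate whose occupancy of one half deviates from
    N/2 by the given fractional fluctuation, relative to the 50/50 state."""
    j = round((n_total / 2) * (1 - fluctuation))
    return exp(log_omega(n_total, j) - log_omega(n_total, n_total // 2))

for n in (200, 1_000, 100_000, 1_000_000):
    print(f"N = {n:>9,}: Omega(2% off) / Omega(50/50) ~ {relative_weight(n):.3g}")
```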

Answered by Nonlin.org on September 28, 2020
