Asked by GiorgioP on April 16, 2021
The Langevin equation is a stochastic differential equation for the velocity of one degree of freedom performing Brownian motion. It is supposed to describe the motion of a big particle on a time scale much longer than that of the collisions with the solvent molecules. The resulting trajectory is a nowhere differentiable curve, and the derivative of the velocity appearing in the equation has to be treated carefully to avoid mathematically ill-defined relations. From the conceptual point of view, the velocity appearing in the Langevin equation looks like a more sophisticated concept than in classical mechanics.
However, the asymptotic limit of the equal-time autocorrelation function $\lim_{t \rightarrow +\infty}\langle v^2(t)\rangle$ is set equal to $3k_BT/m$ so that it obeys the equipartition theorem, which is derived in terms of the usual microscopic velocity.
The question is: what is the missing conceptual link between the velocity appearing in the Langevin equation, which contains a stochastic component, and the velocity appearing in the equipartition theorem, which is usually derived in terms of the momenta of the atomic Hamiltonian (deterministic evolution) and not in terms of coarse-grained stochastic variables? Any pointer to discussions of this issue in the literature would be highly appreciated. I have a vague recollection of finding something in the past, but I am not able to find it again.
If we look at the problem of a diffusing particle with infinite resolution, there really is no problem: we see a particle, then we see a fluid molecule hitting it, momentum is transferred, and so on. Its speed $v$ would be a perfectly well-defined quantity. Nothing strange. The problem, and the reason we model this as a Langevin process, is that we do not have access to this time scale and thus we require a coarse-grained, stochastic modelling. The price we pay is that the speed of the particle $v$ becomes a not-well-defined quantity because we are averaging over multiple collisions. However, some physical facts are certain, and whatever model we use, we need to make sure they are still valid. Equipartition is one of them.
In general, I think there are multiple ways to answer this question, going through stochastic differential equations, Itô vs. Stratonovich, etc., but at the very core of the argument you would find the fluctuation-dissipation theorem or, in the specific case of Brownian motion, the fact that the drag the object experiences in the fluid is connected to its random displacement caused by collisions with the fluid molecules.
This implies - and this was Einstein's great contribution - that mesoscopic particles, the ones you would usually describe with a Langevin equation, are in equilibrium with the surrounding fluid, so whatever $v(t)$ the particle has, which in principle can even be ill-defined, we definitely know that its fluctuations are going to be in agreement with equipartition [in 1D]:
$$\langle v^2 \rangle = k_B T / M$$
$M$ of course being its mass.
Just to clarify, here $\langle v^2 \rangle$ refers not to the velocity of the gas molecules but to that of the particle. What we are saying is that the particle is in equilibrium with a fluid at temperature $T$ and therefore its kinetic energy must be the one predicted by equipartition.
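(In 3D, as in the question, the same statement reads $\langle \mathbf{v}^2 \rangle = 3k_B T / M$, one factor of $k_B T / M$ per translational degree of freedom.)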
Now, this is what physics tells us: a mesoscopic particle has $\langle v^2 \rangle = k_B T / M$ and/or $\langle x^2 \rangle = 2Dt$ in the case of Brownian motion.
These two values are not results of the Langevin equation or of the modelling; they are dictated by the fluctuation-dissipation theorem. So, when we write the Langevin equation for a particle and introduce the usual "weird" noise/force $\eta(t)$, we need to make sure $\eta(t)$ satisfies the above relationships.
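(A side remark, not spelled out in the rest of this answer: the fluctuation-dissipation theorem ties these two numbers together. In 1D the Einstein relation fixes the diffusion coefficient in terms of the drag coefficient $\gamma$,
$$D = \frac{k_B T}{\gamma},$$
so a noise that reproduces $\langle v^2 \rangle$ automatically reproduces the long-time diffusion as well.)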
We write $$M\frac{dv}{dt} = -\gamma v + Q\eta(t)$$
where $Q$ is the "strength" of the noise. We go on with the derivation - I will skip it for now, you seem to be familiar with it - and at the end we set $Q$ such that $\langle v^2 \rangle = k_B T / M$. This is given by general physical principles.
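For completeness, here is a sketch of the skipped step (a standard manipulation, assuming $\langle \eta(t)\eta(t') \rangle = \delta(t-t')$): solving the equation formally and taking the long-time limit gives
$$v(t) = v(0)\,e^{-\gamma t/M} + \frac{Q}{M}\int_0^t e^{-\gamma(t-s)/M}\,\eta(s)\,ds \quad\Longrightarrow\quad \lim_{t\to\infty}\langle v^2(t)\rangle = \frac{Q^2}{2\gamma M},$$
and imposing equipartition, $Q^2/(2\gamma M) = k_B T/M$, fixes the noise strength to $Q = \sqrt{2\gamma k_B T}$.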
The $v$ appearing in Langevin's equation would be the real speed of your particle, in this approach. But because you are messing with the random force $\eta$ - which, physically, is not really delta-correlated - you are making it evolve in the "wrong", unphysical way. This latter fact makes it so that the speed $v$ is not well defined anymore, because you are describing it in terms of a weird, non-differentiable noise term. But, OK, $v$ is non-differentiable; thermodynamically, from physical first principles, we can still say something about $v$ at the scale at which we are looking at our problem: we know $\langle v^2 \rangle$, which does not depend on our time resolution, and that is what we make compatible with Langevin's approach. Still, while $\langle v^2 \rangle$ is now right, $v$ is just a coarse-grained quantity.
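To make this concrete, here is a minimal numerical sketch (my own illustration, not part of the original argument; all parameter values are arbitrary): an Euler-Maruyama integration of the equation above with $Q = \sqrt{2\gamma k_B T}$, checking that the time average of $v^2$ approaches $k_B T / M$ even though the sample path of $v$ is erratic.

```python
# Minimal sketch: Euler-Maruyama integration of M dv/dt = -gamma v + Q eta(t).
# Q is fixed by equipartition / fluctuation-dissipation: Q = sqrt(2 gamma kB T).
# All parameter values are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

M, gamma, kT = 1.0, 1.0, 1.0     # mass, drag coefficient, k_B T (arbitrary units)
Q = np.sqrt(2.0 * gamma * kT)    # noise strength dictated by equipartition
dt, n_steps = 1e-2, 200_000      # time step and number of steps

v = 0.0
v2_sum = 0.0
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt))       # Wiener increment: variance dt
    v += (-gamma * v * dt + Q * dW) / M     # Euler-Maruyama update
    v2_sum += v * v

print("time-averaged <v^2>:", v2_sum / n_steps)  # should approach kT/M = 1.0
```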
However, if we were to refine our model and include some "coherence" in $\eta$ (not delta-correlated anymore, but with some correlation, reflecting the fact that a molecular collision is not really instantaneous but has a finite duration), then we would get a better description of the "real" $v$. I think this is "the missing link" you are looking for: a description which, starting from a Hamiltonian, gives the best possible coarse-graining of the speed, including its coherence in time.
One could indeed model the noise differently, maybe with an Ornstein-Uhlenbeck process; then, instead of a weird non-differentiable trajectory, you get a weird non-differentiable speed and a perfectly fine trajectory. You could model the noise in a different way and make it so that $v$ - in the Langevin equation - is as close as possible to the "physical" speed of the particle.
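As a hedged sketch of this kind of refinement (again my own illustration; the correlation time $\tau_c$ and all values are arbitrary choices), one can generate an exponentially correlated Ornstein-Uhlenbeck noise with its exact one-step update and feed it into the Langevin equation in place of the white noise:

```python
# Sketch: exponentially correlated (Ornstein-Uhlenbeck) noise in place of white noise,
# normalized so that <eta(t) eta(t')> = exp(-|t-t'|/tau_c) / (2 tau_c),
# which tends to the delta-correlated case as tau_c -> 0.
import numpy as np

rng = np.random.default_rng(1)

tau_c, dt, n_steps = 0.1, 1e-3, 100_000
a = np.exp(-dt / tau_c)                        # exact OU decay factor over one step
sigma = np.sqrt((1.0 - a**2) / (2.0 * tau_c))  # keeps the stationary variance at 1/(2 tau_c)

eta = np.empty(n_steps)
eta[0] = rng.normal(0.0, np.sqrt(1.0 / (2.0 * tau_c)))
for i in range(1, n_steps):
    eta[i] = a * eta[i - 1] + sigma * rng.normal()

# Feeding this eta into M dv/dt = -gamma v + Q eta(t) instead of white noise
# produces a velocity with a finite correlation structure at short times.
```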
A Langevin-like treatment with the usual $\eta$ noise completely neglects these puzzling cases: it averages out several collisions over a small time, ignores time correlations, and focuses on getting the diffusion or $\langle v^2 \rangle$ right - finer details are also very hard to access experimentally anyway! This $v$ is not the real $v$; it is a sampling of $v(t)$ at times bigger than its variation time scale. So we choose not to describe it explicitly but as a random process.
Nowadays, with modern techniques, the instantaneous $v$ of particles can be measured much more precisely, and we know that at sufficiently short timescales Langevin's equation, with its delta-correlated noise, is wrong: Brownian particles have a ballistic behavior at short timescales, and as long as they are ballistic we can differentiate their position to get their speed. The real $v$ is correlated in time - albeit for a short time. As expected, from a physical point of view.
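Quantitatively, the ballistic-to-diffusive crossover reads (a standard result for the equation above, added here for reference; $\tau_p = M/\gamma$ is the momentum relaxation time and $D = k_B T/\gamma$):
$$\langle \Delta x^2(t) \rangle = \frac{2 k_B T}{\gamma}\left[t - \tau_p\left(1 - e^{-t/\tau_p}\right)\right] \approx \begin{cases} \dfrac{k_B T}{M}\,t^2 & t \ll \tau_p \quad \text{(ballistic)} \\[1ex] 2Dt & t \gg \tau_p \quad \text{(diffusive)} \end{cases}$$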
But if we look at particles on a bigger timescale, we get Brownian motion, because we lack the resolution to see the fine details and we lose the possibility of differentiating the trajectory: its variations are below our observation ability. But they are there; we just model them with a weird force $\eta$ and get rid of the problem.
Also, if you were to model the fluid molecules explicitly, according to their Hamiltonian (say, at time $t_a$ molecule $a$ hits our particle with momentum $p_a$), you would get a description which is differentiable, so you would get a "physical" $v$: nice, precise, analytic, etc. But it's a lot of work, and we can avoid it by using $\eta$ and paying the price of non-differentiability of $v$.
On the other hand, equipartition always holds, at all timescales. So our modelling using the Langevin equation must take it into account.
Correct answer by JalfredP on April 16, 2021