Physics Asked by OllieGrayson on August 13, 2021
I’m working through a proof in Griffiths’s *Introduction to Quantum Mechanics* (Chapter 1.4 – Normalization) and feel like a subtle detail is being omitted. If anyone can supply clarity, that would help.
We have $\Psi = \Psi(x,t)$ and the author ends up reaching this point in the proof:
$$\frac{d}{dt}\int^\infty_{-\infty}|\Psi|^2\, dx = a\int^\infty_{-\infty}\frac{\partial}{\partial x}\bigg(\Psi^*\frac{\partial\Psi}{\partial x}-\frac{\partial\Psi^*}{\partial x}\Psi\bigg)dx$$
where $a$ is a constant. Up to this point, I follow. Then the author claims
$$a\int^\infty_{-\infty}\frac{\partial}{\partial x}\bigg(\Psi^*\frac{\partial\Psi}{\partial x}-\frac{\partial\Psi^*}{\partial x}\Psi\bigg)dx = a\bigg(\Psi^*\frac{\partial\Psi}{\partial x}-\frac{\partial\Psi^*}{\partial x}\Psi\bigg)\bigg|^\infty_{-\infty} = 0,$$
on the grounds that the wave function must vanish at infinity. Here is my issue:
Let
$$\bigg(\Psi^*\frac{\partial\Psi}{\partial x}-\frac{\partial\Psi^*}{\partial x}\Psi\bigg) = f(x,t),$$
as I assume it must. Isn’t it true then that
$$a\int^\infty_{-\infty}\frac{\partial f(x,t)}{\partial x}\,dx = af(x,t)\bigg|^\infty_{-\infty}+aC(t),$$
where $C(t)$ is the “constant” of integration with respect to $x$?
So the author is saying the first term goes to zero physically, but makes no mention of the second term, which would be a function of $t$ alone. Am I missing a physical point of intuition here? Or am I perhaps wrong in my integration of partial derivatives? Any help is greatly appreciated.
For context, this proof is being done to show that the normalization of the wave function does not change over time.
I believe you are confused because you may have heard that if you have a function $f(x,t)$, then
$$\int\frac{\partial f}{\partial x}\, \text{d} x = f(x,t) + C(t),$$
and this is indeed true. However, this changes when you have a definite integral: the $C(t)$ cancels out when you take the difference of the boundary terms. Here's how to see it explicitly:
$$\begin{aligned}\int_{a}^b\frac{\partial f}{\partial x}\, \text{d} x = \left( f(x,t) + C(t)\right)\Big|_{a}^b &= \Big(f(b,t) + C(t)\Big) - \Big(f(a,t) + C(t)\Big)\\ &= f(b,t) - f(a,t)\\ &= f(x,t)\Big|_{a}^b,\end{aligned}$$
as it should be. (It's the same reason you don't have constants of integration for ordinary one-variable definite integrals.)
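If you want to convince yourself numerically, here is a quick sketch (my own addition, with an arbitrary choice of $f(x,t) = t\,e^{-x^2}$): the definite integral of $\partial f/\partial x$ over $[a,b]$ matches $f(b,t) - f(a,t)$ exactly, with no leftover function of $t$.

```python
import numpy as np

# Illustrative example function f(x, t) = t * exp(-x^2) and its x-derivative.
def f(x, t):
    return t * np.exp(-x**2)

def dfdx(x, t):
    return -2 * x * t * np.exp(-x**2)

a, b = -1.5, 2.0
x = np.linspace(a, b, 200001)

for t in (0.5, 1.0, 3.7):
    lhs = np.trapz(dfdx(x, t), x)   # numerically integrate df/dx over [a, b]
    rhs = f(b, t) - f(a, t)         # boundary difference; any C(t) cancels
    assert abs(lhs - rhs) < 1e-8

print("definite integral equals f(b,t) - f(a,t) for every t tested")
```

No matter what $t$ is, the two sides agree, because whatever $C(t)$ you attach to the antiderivative is subtracted from itself.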
Answered by Philip on August 13, 2021