Physics: Asked by randomdude on December 18, 2020
I have a question concerning the time ordering operator. Let’s suppose we have a time evolution generated by some Hamiltonian $H(t)$ given by
$$
U(t)=T_\leftarrow\exp\left(-\mathrm{i}\int_0^t\mathrm{d}s\,H(s)\right)\tag{1}.
$$
In Breuer and Petruccione, it is said that if the commutator of the Hamiltonian at some time $t$ with itself at some other time $t^\prime$ is a $c$-number function, i.e. a complex-valued function $\left[H(t),\,H(t^\prime)\right]=f(t,t^\prime)$, then the time evolution is given by
$$
U(t)=\exp\left(-\frac{1}{2}\int_0^t\mathrm{d}s\int_0^t\mathrm{d}s^\prime\,\left[H(s),\,H(s^\prime)\right]\Theta(s-s^\prime)\right)\exp\left(-\mathrm{i}\int_0^t\mathrm{d}s\,H(s)\right),\tag{2}
$$
where $\Theta(s-s^\prime)$ is the Heaviside step function. No proof or reference is given, and I could not find an explanation anywhere, which is why I am asking here. Any help would be much appreciated.
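For concreteness, here is a minimal numerical sketch of what Eq. (2) asserts (my own check, not from Breuer and Petruccione). It uses 3×3 upper-triangular matrices as a finite-dimensional stand-in for $H(t)$: their commutator is a central element rather than a literal $c$-number, which is all the identity needs (cf. condition (C) in the answers below).

```python
# Minimal numerical check of Eq. (2) -- a sketch, not from Breuer & Petruccione.
# Stand-in for H(t): f(t) X + g(t) Y with X, Y from the 3x3 Heisenberg algebra,
# so [H(s), H(s')] is proportional to a *central* element (it commutes with
# every H(t)), which is the property the identity actually relies on.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1, 0], [0, 0, 0], [0, 0, 0]], dtype=complex)  # E_12
Y = np.array([[0, 0, 0], [0, 0, 1], [0, 0, 0]], dtype=complex)  # E_23

def H(t):
    return np.cos(t) * X + t * Y        # arbitrary smooth coefficient functions

t_final, N = 1.0, 2000
dt = t_final / N
mid = (np.arange(N) + 0.5) * dt         # midpoints of the time grid

# Left-hand side of Eq. (2): time-ordered exponential built as a product of
# short-time propagators, later times acting from the left (T_<- ordering).
U_T = np.eye(3, dtype=complex)
for s in mid:
    U_T = expm(-1j * H(s) * dt) @ U_T

# Right-hand side of Eq. (2): the two exponential factors, integrals discretised.
S1 = -1j * dt * sum(H(s) for s in mid)
comm = np.zeros((3, 3), dtype=complex)
running = np.zeros((3, 3), dtype=complex)            # sum of H(s') over s' < s
for s in mid:
    Hs = H(s)
    comm += (Hs @ running - running @ Hs) * dt * dt  # Theta(s - s') restriction
    running += Hs
U_rhs = expm(-0.5 * comm) @ expm(S1)

print(np.max(np.abs(U_T - U_rhs)))      # small, and shrinks as N is increased
```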
You are looking for a mathematical result known as the Magnus expansion. In general, this gives an exact representation of the time-ordered matrix exponential
$$V(t) = {\rm T} \exp \left( \int_0^t dt' \, A(t')\right),$$
in terms of an equivalent ordinary exponential
$$ V(t) = \exp\left(S(t)\right),$$
where $S(t)$ can be expressed as an infinite series of nested commutators, $S(t) = \sum_{n=1}^\infty S_n(t)$, e.g.
\begin{align}
S_1 & = \int_0^t dt_1\, A(t_1), \\
S_2 & = \frac{1}{2}\int_0^t dt_1 \int_0^{t_1} dt_2\, [A(t_1),A(t_2)], \\
& \;\;\vdots
\end{align}
The next terms in the expansion involve higher-order commutators like $[A(t_3),[A(t_1),A(t_2)]]$, which obviously vanish when $[A(t_1),A(t_2)]$ is a $c$-number. See Blanes et al., Physics Reports 470 (2009), 151-238 for further details.
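To spell out how this reproduces Eq. (2): setting $A(t) = -\mathrm{i}H(t)$, the second Magnus term becomes
$$ S_2(t) = \frac{1}{2}\int_0^t \mathrm{d}s\int_0^{s}\mathrm{d}s^\prime\,[A(s),A(s^\prime)] = -\frac{1}{2}\int_0^t\mathrm{d}s\int_0^t\mathrm{d}s^\prime\,\left[H(s),\,H(s^\prime)\right]\Theta(s-s^\prime), $$
and since this is a $c$-number it commutes with $S_1(t) = -\mathrm{i}\int_0^t\mathrm{d}s\,H(s)$, so that $\exp(S_1+S_2) = \exp(S_2)\exp(S_1)$, which is exactly Eq. (2).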
Answered by Mark Mitchison on December 18, 2020
OP's formula (2) is a continuum version of
$$ \exp(A_n)\ldots \exp(A_1)~=~\exp\left(\sum_{i\in\{1, \ldots, n\}} A_i + \frac{1}{2}\sum_{i,j\in\{1, \ldots, n\}}^{i>j} [A_i,A_j]\right),\tag{A}$$
or equivalently,
$$ \exp(A_1)\ldots \exp(A_n)~=~\exp\left(\sum_{i\in\{1, \ldots, n\}} A_i + \frac{1}{2}\sum_{i,j\in\{1, \ldots, n\}}^{i<j} [A_i,A_j]\right),\tag{B}$$
which are valid if we assume
$$ \forall i,j,k~\in~\{1, \ldots, n\}: \quad [[A_i,A_j],A_k]~=~0. \tag{C} $$
Eq. (B) follows by repeated application of the truncated BCH formula:
$$ e^Ae^B~=~e^{A+B+\frac{C}{2}}, \qquad C~\equiv~[A,B], \qquad \text{if}\qquad [A,C]~=~0~=~[B,C]. \tag{D}$$
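A minimal numerical sketch of Eq. (D), using SciPy and 3×3 Heisenberg-algebra matrices, for which $C=[A,B]$ commutes with both $A$ and $B$:

```python
# Minimal numerical check of Eq. (D) -- a sketch using 3x3 Heisenberg-algebra
# matrices, for which C = [A, B] commutes with both A and B.
import numpy as np
from scipy.linalg import expm

A = 0.7 * np.array([[0, 1, 0], [0, 0, 0], [0, 0, 0]], dtype=complex)  # prop. to E_12
B = 1.3 * np.array([[0, 0, 0], [0, 0, 1], [0, 0, 0]], dtype=complex)  # prop. to E_23
C = A @ B - B @ A                                                     # central element

# the assumption of Eq. (D): C commutes with A and with B
assert np.allclose(A @ C, C @ A) and np.allclose(B @ C, C @ B)

# left- and right-hand sides of Eq. (D) agree to machine precision
print(np.max(np.abs(expm(A) @ expm(B) - expm(A + B + C / 2))))
```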
Answered by Qmechanic on December 18, 2020