
Vector representation of wavefunction in quantum mechanics?

Asked by user23145 on October 3, 2021

I am new to quantum mechanics, and I have just studied parts of the “wave mechanics” version of quantum mechanics. But I have heard that a wavefunction can be represented as a vector in a Hilbert space. To me, this seems almost implausible. How does this work? Can anyone present an example of a wavefunction and show how it gets converted to vector form?

So I am referring to the Dirac bra-ket formulation of quantum mechanics.

3 Answers

The solution to the Schrödinger equation, $\Psi$, is interpreted as a probability amplitude. In quantum mechanics the inner product is often taken because it is a way to obtain physically observable quantities, such as the probability of a particle being at a certain position when measured. For complex-valued functions in a complex Hilbert space, a matrix element takes the form $$ \int \Psi_1^* A \Psi_2 \,\mathrm{d}x $$ where $^*$ denotes the complex conjugate, $A$ is an operator applied to $\Psi_2$, and the integral runs over the whole domain. Keep in mind that a Hilbert space is just a vector space of finite or infinite dimension with an inner product defined on it. Now we can use bra-ket notation to make everything a lot neater. A ket is simply an $N$-dimensional vector and can be thought of as a column vector: $$ \left|C\right\rangle = \left( \begin{array}{c} c_1 \\ c_2 \\ \vdots \\ c_N \end{array} \right) = C(x) = \sum_n c_n C_n(x) $$
where the $C_n(x)$ are the basis functions of the space. The bra is the Hermitian conjugate of the ket, which can be interpreted as $$ \left\langle C\right| = \left|C\right\rangle^{\dagger} = \left( \begin{array}{cccc} c_1^* & c_2^* & \cdots & c_N^* \end{array} \right) = \sum_n c_n^* C_n^*(x) $$ When we combine a bra and a ket together we get a bracket, and it signifies the inner product on the vector space. Therefore we can represent our example from above in a much neater fashion using bras and kets, $$ \Psi_1^* \longleftrightarrow \left\langle \Psi_1 \right| $$ $$ \Psi_2 \longleftrightarrow \left|\Psi_2\right\rangle $$ and now our inner product can be written as $$ \left\langle \Psi_1 \middle| \Psi_2 \right\rangle = \int \Psi_1^* \Psi_2 \,\mathrm{d}x $$ We can add our operator in like so: $$ \left\langle \Psi_1 \right| A \left| \Psi_2 \right\rangle = \left\langle \Psi_1 \middle| A\Psi_2 \right\rangle = \left\langle A^\dagger \Psi_1 \middle| \Psi_2 \right\rangle = \int \Psi_1^* A \Psi_2 \,\mathrm{d}x $$ In this case we were only considering functions of a single variable, but the nice thing is that bra-ket notation generalizes very easily. In an $N$-dimensional space, operators are $N \times N$ matrices. For an operator to represent a physical observable (so that its expectation values are real), it must be Hermitian, such that $$ A^\dagger = A $$ Such an operator has the form $$ A^\dagger = A = \left( \begin{array}{cccc} a_{00} & a_{01} & \cdots & a_{0N} \\ a_{10} & \ddots & \ddots & \vdots \\ \vdots & \ddots & \ddots & \vdots \\ a_{N0} & \cdots & \cdots & a_{NN} \end{array} \right) $$
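To make the finite-dimensional picture concrete, here is a minimal NumPy sketch (a made-up example of my own, not something from the answer above): a ket as a column vector, the bra as its conjugate transpose, and a Hermitian operator sandwiched between them.

```python
import numpy as np

# A ket |psi> as a normalized column vector in a 2-dimensional complex space.
psi = np.array([[1.0], [1.0j]]) / np.sqrt(2)

# The bra <psi| is the Hermitian conjugate (conjugate transpose) of the ket.
bra = psi.conj().T

# A Hermitian operator (here the Pauli-x matrix), satisfying A^dagger = A.
A = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)
assert np.allclose(A, A.conj().T)

print(bra @ psi)      # [[1.+0.j]] -- the squared norm <psi|psi>
print(bra @ A @ psi)  # [[0.+0.j]] -- <psi|A|psi>, real because A is Hermitian
```

The infinite-dimensional case works the same way, with the matrix products replaced by integrals like the ones above.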

I hope this gave some clarification. I'm by no means an expert, so if I made any critical errors, let me know; I'm always learning as well.

Answered by WhiteWhim on October 3, 2021

Chances are, you've already seen it happen in your past studies of math and just never heard it called that. In the following, I'm going to try to give an example to motivate the answer that it's not just possible, but quite ordinary, without going into rigorous detail.

The first thing to realize is that ordinary real- or complex-valued functions over the same domain already form a vector space. A vector space is simply a collection of objects that can be added together and scaled through multiplication by suitable numbers (a field). The full definition is somewhat more involved than that and can be found in any linear algebra text, but it's obvious that one can add these functions and multiply them by scalars, and the other defining properties of "vector space" also hold (e.g., $\vec{0}$ is the function that maps everything in the domain to $0$).

However, what we want is not just a vector space, but also an inner product generalizing the role of the dot product of vectors in $\mathbb{R}^n$. There are infinitely many choices here, but one of the more generally useful ones is $$\langle f|g\rangle = \int f^*(x)g(x)\,\mathrm{d}x\text{,}$$ with the integral over the entire domain and $^*$ representing complex conjugation. This is an obvious generalization of the Euclidean dot product $$\vec{u}\cdot\vec{v} = \sum_k u_k v_k\text{,}$$ adjusted to handle complex-valued functions so as to ensure that $\langle f|f\rangle \geq 0$, i.e., that vectors have non-negative norm-squared. Recall that in Euclidean space, for a unit vector $\hat{u}$, the dot product $\hat{u}\cdot\vec{v}$ is the component of $\vec{v}$ along $\hat{u}$, so that the projection of $\vec{v}$ onto $\hat{u}$ must be $\hat{u}(\hat{u}\cdot\vec{v})$. Thus for arbitrary vectors, not necessarily normalized, $$\text{Projection of $\vec{v}$ onto $\vec{u}$} = \vec{u}\frac{\vec{u}\cdot\vec{v}}{\vec{u}\cdot\vec{u}}\text{,}$$ so given an orthogonal basis $\{\vec{e}_k\}$, we can write any vector in terms of its components in that basis: $$\vec{v} = \sum_k \vec{e}_k\frac{\vec{e}_k\cdot\vec{v}}{\vec{e}_k\cdot\vec{e}_k}\text{.}$$
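The projection formula is easy to check numerically. The following sketch, with a made-up orthogonal basis of $\mathbb{R}^3$, rebuilds a vector from its components:

```python
import numpy as np

# An orthogonal (but deliberately not normalized) basis of R^3.
basis = [np.array([1.0,  1.0, 0.0]),
         np.array([1.0, -1.0, 0.0]),
         np.array([0.0,  0.0, 2.0])]

v = np.array([3.0, -1.0, 4.0])

# v = sum_k e_k (e_k . v) / (e_k . e_k), exactly as in the formula above.
rebuilt = sum(e * (e @ v) / (e @ e) for e in basis)
print(rebuilt)   # [ 3. -1.  4.]
```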

Ok, but what about functions? Does it make sense? Can we actually write something like $$f(x) = \sum_k g_k(x)\frac{\langle g_k|f\rangle}{\langle g_k|g_k\rangle}$$ for a "basis" $\{g_k(x)\}$, using that integral as an inner product to replace the dot product?

As a simple example, suppose we're dealing with real functions on $[-\pi,\pi]$, and I define the functions $$c_n(x) = \cos(nx),\quad n\geq 0\text{;}\qquad s_n(x) = \sin(nx),\quad n > 0\text{.}$$ These vectors are orthogonal: for $n\neq m$, $$\int_{-\pi}^\pi c_n c_m\,\mathrm{d}x = \int_{-\pi}^\pi s_n s_m\,\mathrm{d}x = \int_{-\pi}^\pi c_n s_m\,\mathrm{d}x = 0 = \int_{-\pi}^\pi c_n s_n\,\mathrm{d}x\text{.}$$ Although it is not immediately obvious, they also form a basis: given $f:[-\pi,\pi]\rightarrow\mathbb{R}$, one can write $$f(x) = \sum_{n = 0}^\infty c_n(x)\frac{\langle c_n|f\rangle}{\langle c_n|c_n\rangle} + \sum_{n=1}^\infty s_n(x)\frac{\langle s_n|f\rangle}{\langle s_n|s_n\rangle}$$ And all I've done is written the standard Fourier series in vector notation, since if you actually do the integrals, then $\langle c_0|c_0\rangle = 2\pi$ and $\langle c_n|c_n\rangle = \langle s_n|s_n\rangle = \pi$ for $n>0$, while the numerators turn into the usual Fourier coefficients.
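This is straightforward to verify numerically. Here is a small sketch (the test function $f(x) = x^2$ is my own arbitrary choice) that computes the inner products by crude numerical integration and rebuilds $f$ from the basis above:

```python
import numpy as np

x = np.linspace(-np.pi, np.pi, 20001)
dx = x[1] - x[0]
f = x**2                      # an arbitrary test function on [-pi, pi]

def inner(g, h):
    # <g|h> = integral of g(x) h(x) over [-pi, pi], by a simple Riemann sum.
    return np.sum(g * h) * dx

one = np.ones_like(x)         # c_0(x) = cos(0*x) = 1
approx = one * inner(one, f) / inner(one, one)
for n in range(1, 30):
    cn, sn = np.cos(n * x), np.sin(n * x)
    approx += cn * inner(cn, f) / inner(cn, cn)
    approx += sn * inner(sn, f) / inner(sn, sn)

print(np.max(np.abs(f - approx)))   # already small; shrinks as more terms are kept
```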

In other words, the Fourier series writes a function over a finite interval in terms of a particular countably infinite orthogonal basis $\{1,\cos(nx),\sin(nx) : n>0\}$. Similarly, a Fourier transform can be thought of as writing a function in terms of an uncountably infinite orthogonal basis, so sums are replaced with integrals.

However, this means that both $f(x)$ and the list of Fourier coefficients give you the same information: they're just different representations of the same mathematical object, the vector. Therefore, as nervxxx notes, we can consider $f(x)$ to simply be some vector written in the position basis (an uncountably infinite basis), rather than treating the function itself as the fundamental object.

(N.B. there's nothing particularly special about the Fourier basis; many other choices are possible. Also, Hilbert spaces require a bit more than the mere existence of an inner product, but that turns out to hold here as well.)
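As a sanity check on the "same information" claim, one can compute an inner product both ways: directly from function values, and purely from Fourier coefficients (a Parseval-type identity). A small sketch, with two made-up trigonometric test functions:

```python
import numpy as np

x = np.linspace(-np.pi, np.pi, 20001)
dx = x[1] - x[0]
inner = lambda g, h: np.sum(g * h) * dx

f = np.cos(x) + 0.5 * np.sin(3 * x)
g = 2.0 * np.cos(x) - np.sin(3 * x)

# "Position" representation: integrate the function values directly.
direct = inner(f, g)

# "Fourier" representation: the same number from coefficients alone.
basis = [np.ones_like(x)] \
      + [np.cos(n * x) for n in range(1, 6)] \
      + [np.sin(n * x) for n in range(1, 6)]
from_coeffs = sum(inner(e, f) * inner(e, g) / inner(e, e) for e in basis)

print(direct, from_coeffs)   # both ~ 1.5 * pi = 4.712...
```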

Answered by Stan Liou on October 3, 2021

I think your confusion comes from misunderstanding the term 'vector'... which is understandable, because it's an overloaded term.

Traditionally, a "vector" is an arrow which exists in some physical space and usually represents some physical quantity like "velocity" or "force". But that's not what we're talking about here.

A "vector space" is a more abstract mathematical concept. It consists of a set $V$ of objects called "vectors" (which can be literally anything, they don't need to be arrows), a set $S$ of "scalars" (which again can be anything, not just real numbers), addition/multiplication operators, and a list of rules that all these things must follow. Anything which has these sets/operations/rules is called a "vector space".

The most common examples of vector spaces are the spaces of arrow-vectors. This is unsurprising, because the study of vector spaces stemmed from the study of arrow-vectors (hence the overlapping terms)! However, there are examples of vector spaces which have (conceptually) nothing to do with arrows:

  • For any field $F$, the set of polynomials over $F$ forms a vector space
  • The set of $n \times n$ matrices, with normal addition/scalar multiplication, forms a vector space
  • The set of single-variable real-valued functions, with normal addition/scalar multiplication, forms a vector space

As you can see from the last example, it's possible for the "vectors" of the vector space to be functions. In this case, the "vectors" are unrelated to arrow-vectors.
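A sketch of that last example in code, assuming nothing beyond plain Python: "adding" and "scaling" functions just yields new functions, which is exactly the vector-space structure.

```python
import math

# Vector-space operations on single-variable real-valued functions.
def add(f, g):
    return lambda x: f(x) + g(x)

def scale(c, f):
    return lambda x: c * f(x)

# The "vector" 2*sin + cos is itself just another function.
h = add(scale(2.0, math.sin), math.cos)
print(h(0.0))            # 1.0   (2*sin(0) + cos(0))
print(h(math.pi / 2))    # ~2.0  (2*sin(pi/2) + cos(pi/2))
```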


Side notes:

  • The "dimension" of a vector space is the minimum number of vectors that can be added together/multiplied-by-scalars to form all other vectors of the vector space. Because no finite list of single-variable real-valued functions is sufficient for this, we say that that vector space is "infinite dimensional". That makes it sound deep and complicated, but it's not.

  • The main difference between a vector space and a Hilbert space is that a Hilbert space has an operation called an "inner product" (strictly, it must also be complete with respect to the norm that product defines). The inner product can be any operation that follows certain rules. For arrow-vectors, this is the normal dot product $a \cdot b$, but for wave functions it's something completely different.

  • Because the "vectors" of a vector-space share so many properties with arrow-vectors, it's sometimes convenient to think of the "vectors" as arrow-vectors, even when they have nothing to do with arrows. Unfortunately, this worsens the "overlapping definitions" problem. For example, the vector space of $n times n$ matricies can be modeled as the $n^2$-dimensional space of arrow-vectors. Just don't try to visualize infinite-dimensional vector spaces this way!

  • To make matters even more confusing, wave functions in QM return complex numbers, and complex numbers are often represented by arrow-vectors. So, in one sense the functions are vectors (of a vector space), and in another sense they return vectors (representing complex numbers). Ahh! Fortunately, even when authors aren't precise in their language, it's usually obvious which definition they're talking about.
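To illustrate the matrix example from the notes above, here is a tiny sketch: flattening an $n \times n$ matrix gives an ordinary arrow-vector in $n^2$ dimensions, and the standard matrix inner product $\mathrm{tr}(A^T B)$ becomes the familiar dot product of those flattened vectors.

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

# Matrix inner product <A, B> = trace(A^T B)...
print(np.trace(A.T @ B))           # 70.0

# ...equals the dot product of the matrices viewed as 4-dimensional vectors.
print(A.flatten() @ B.flatten())   # 70.0
```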

Answered by BlueRaja - Danny Pflughoeft on October 3, 2021
