Physics Asked by user27058 on September 4, 2021
I read somewhere that people write the gradient in covariant form for their own purposes.
I think the gradient is expanded in the covariant basis $i$, $j$, $k$, so by the invariant nature of vectors its components should be contravariant. However, from the transformation properties and the chain rule we find that it is a covariant vector. What is wrong with my reasoning?
My second question is: if the gradient is written in covariant form, what is the contravariant form of the gradient?
A covariant vector is commonly a vector whose components are written with a "downstairs" index, like $x_\mu$. Now, the gradient is defined as $\partial_\mu := \dfrac{\partial}{\partial x^\mu}$. As you can see, the covariant vector $\partial_\mu$ is the derivative with respect to the contravariant vector $x^\mu$. The contravariant form of $\partial_\mu$ is $\partial^\mu := g^{\mu\nu}\partial_\nu$, and in case the metric is constant, $\partial^\mu = \frac{\partial}{\partial x_\mu}$.
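For a concrete illustration (my addition, assuming the constant Minkowski metric $\eta_{\mu\nu} = \mathrm{diag}(+1,-1,-1,-1)$ with $c = 1$, which is not specified in the answer above), raising the index just flips the sign of the spatial derivatives:
$$ \partial^\mu = \eta^{\mu\nu}\partial_\nu \quad\Longrightarrow\quad \partial^0 = \partial_0 = \frac{\partial}{\partial t}, \qquad \partial^i = -\partial_i \quad (i = 1,2,3), $$
so that $\partial^\mu \partial_\mu = \partial_t^2 - \nabla^2$ is the wave operator.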
Sometimes the symbol $\partial_\mu$ is used to indicate a coordinate basis of the tangent vector space at some point of a manifold. In that case the index $\mu$ is not a component index (which for a vector would be upstairs); it labels the $\mu$-th vector of the basis.
Answered by Oscar on September 4, 2021
Gradient is covariant! Why?
The components of a vector are contravariant because they transform in the inverse (i.e. contra) way to the vector basis. It is customary to denote these components with an upper index, so if your coordinates are called $q$'s, they are denoted $q^i$.
Therefore, the gradient (or a derivative if you prefer) is $$\partial_i = \frac{\partial}{\partial q^i},$$ which transforms as the inverse of the component transformation ( 1 / contra-variant = co-variant ).
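A minimal worked case (my addition, not from the answer above): rescale all coordinates by a constant, $\bar q^i = \lambda\, q^i$. The components pick up a factor of $\lambda$, while the chain rule gives
$$ \frac{\partial}{\partial \bar q^i} = \frac{1}{\lambda}\,\frac{\partial}{\partial q^i}, $$
so the partial derivatives scale by $1/\lambda$, the same way as the basis vectors, which is exactly the covariant behaviour.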
If you're still not convinced... try this!
Cheers! ;-)
Answered by Dox on September 4, 2021
Recall the integral definition of the gradient:
$$\nabla \varphi = \lim_{V \to 0} \frac{1}{V} \oint_{\partial V} \varphi\, \hat n \, dS$$
This should tell you the gradient's components transform the same way as those of the normal vector $\hat n$, which is known to have covariant components.
You can verify that the normal vector has covariant components by recalling that the normal can be defined through a cross product of tangent vectors (which have contravariant components; the cross product of true vectors is a pseudovector, which has covariant components), for instance.
Answered by Muphrid on September 4, 2021
Most of the answers posted here are incorrect. The Wikipedia page for the gradient says
The gradient of $f$ is defined as the unique vector field whose dot product with any vector $v$ at each point $x$ is the directional derivative of $f$ along $v$.
A look at Theodore Frankel's The Geometry of Physics confirms this. Other posters have said that the components of the gradient of $f$ are given by $\partial_i f$; these are in fact the components of the differential of $f$, which is a covector. The gradient is this with the index raised.
Let's now calculate both sides of the expression from Wikipedia. The inner product of $\mathrm{grad}(f)$ with a vector $v$ is
$$ \mathrm{grad}(f)^{i}\, g_{ij}\, v^j = \mathrm{grad}(f)^i\, v_i. $$
The directional derivative of $f$ along $v$ is
$$ D_v f = v^i\, \partial_i f = g^{ij}\, \partial_i f \, v_j. $$
We can clearly identify
$$ \mathrm{grad}(f)^{i} = g^{ij}\, \partial_j f, $$
or
$$ \mathrm{grad}\, f = g^{-1}\, \mathrm{d} f. $$
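A quick sanity check (my own example, plane polar coordinates with $g_{ij} = \mathrm{diag}(1, r^2)$ and hence $g^{ij} = \mathrm{diag}(1, 1/r^2)$):
$$ \mathrm{grad}(f)^r = \partial_r f, \qquad \mathrm{grad}(f)^\theta = \frac{1}{r^2}\,\partial_\theta f. $$
Rewritten in the unit-length basis $\hat e_\theta = \frac{1}{r}\,\partial_\theta$, this reproduces the familiar $\nabla f = \partial_r f\,\hat e_r + \frac{1}{r}\,\partial_\theta f\,\hat e_\theta$, whereas the covariant components $(\partial_r f, \partial_\theta f)$ alone would not.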
Answered by ZachMcDargh on September 4, 2021
The gradient is covariant. Consider the gradient of a scalar function: each of its components is the change of the function per unit distance along the corresponding basis vector.
We often treat the gradient as an ordinary vector because we usually transform from one orthonormal basis to another orthonormal basis, and in that case the transpose and the inverse of the transformation matrix are the same.
Let $E$ and $E'$ be the matrices whose rows are the basis vectors, and let $A$ be the transformation matrix between them:
$$ E' = \begin{bmatrix} \hat{e}'_1 \\ \hat{e}'_2 \\ \vdots \\ \hat{e}'_n \end{bmatrix} = A \begin{bmatrix} \hat{e}_1 \\ \hat{e}_2 \\ \vdots \\ \hat{e}_n \end{bmatrix} = A E. $$
Let $\hat{v}$ be a vector with components $v^i$ in the old basis and $v'^i$ in the new basis. Then
$$ \hat{v} = v^i\hat{e}_i = E^T\begin{bmatrix}v^1 \\ v^2 \\ \vdots \\ v^n\end{bmatrix} \tag{1} $$
$$ \hat{v} = v'^i\hat{e}'_i = (E')^T\begin{bmatrix}v'^1 \\ v'^2 \\ \vdots \\ v'^n\end{bmatrix} = (A E)^T\begin{bmatrix}v'^1 \\ v'^2 \\ \vdots \\ v'^n\end{bmatrix} = E^T A^T\begin{bmatrix}v'^1 \\ v'^2 \\ \vdots \\ v'^n\end{bmatrix} \tag{2} $$
From $(1)$ and $(2)$ we have
$$ A^T\begin{bmatrix}v'^1 \\ \vdots \\ v'^n\end{bmatrix} = \begin{bmatrix}v^1 \\ \vdots \\ v^n\end{bmatrix}, $$
and finally
$$ \boxed{\begin{bmatrix}v'^1 \\ \vdots \\ v'^n\end{bmatrix} = (A^T)^{-1}\begin{bmatrix}v^1 \\ \vdots \\ v^n\end{bmatrix}} $$
so the contravariant components transform with $(A^T)^{-1}$, while the gradient's components follow the basis and transform with $A$ itself.
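To make the orthonormal remark concrete (my own example): for a rotation
$$ A = \begin{bmatrix}\cos\theta & \sin\theta \\ -\sin\theta & \cos\theta\end{bmatrix}, \qquad (A^T)^{-1} = A, $$
covariant and contravariant components transform identically, which is why the distinction is invisible in Cartesian vector calculus. For a non-orthogonal change such as stretching the first basis vector, $A = \mathrm{diag}(\lambda, 1)$, we get $(A^T)^{-1} = \mathrm{diag}(1/\lambda, 1)$: vector components in the first slot shrink by $1/\lambda$ while gradient components grow by $\lambda$, so the two transformation laws can no longer be conflated.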
Answered by Yola on September 4, 2021
I'll offer a simple explanation that relies only on the slope.
Suppose we have a line with slope 5, so for every 5 units up there corresponds one unit to the right. Now let us dilate the y-axis by 3, that is, the space between each increment of 1 is now 3 times larger. In order for our slope to remain invariant (equal to 5), the slope's y-component must also dilate by a factor of 3; that is, it must co-vary with the y-axis.
Answered by Ken Wang on September 4, 2021
Allow me to try to provide the simplest explanation of why the gradient is a covariant vector.
By definition, the components of a covariant vector transform according to the law:
$$ \overline A_i = \sum_{j=1}^n \frac {\partial x^j} {\partial \overline x^i} A_j \qquad \qquad (1) $$
and the components of a contravariant vector transform according to the law:
$$ \overline A^i = \sum_{j=1}^n \frac {\partial \overline x^i} {\partial x^j} A^j \qquad \qquad (2) $$
If the components of the gradient of a scalar field $f$ in the coordinate system $x^j$, namely $\frac{\partial f}{\partial x^j}$, are known, then we can find its components in the coordinate system $\overline x^i$, namely $\frac{\partial f}{\partial \overline x^i}$, by the chain rule:
$$ \frac {\partial f} {\partial \overline x^i} = \frac {\partial f} {\partial x^1} \frac {\partial x^1} {\partial \overline x^i} + \frac {\partial f} {\partial x^2} \frac {\partial x^2} {\partial \overline x^i} + \cdots + \frac {\partial f} {\partial x^n} \frac {\partial x^n} {\partial \overline x^i} = \sum_{j=1}^n \frac {\partial x^j} {\partial \overline x^i}\frac {\partial f} {\partial x^j}. $$
Obviously $\frac {\partial f} {\partial \overline x^i}=\overline A_i$ and $\frac {\partial f} {\partial x^j} = A_j$, so we recover exactly equation $(1)$.
Therefore, the gradient is a covariant vector.
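A concrete instance (my own example): take Cartesian coordinates $x^1 = x$, $x^2 = y$ and new coordinates $\overline x^1 = r$, $\overline x^2 = \theta$ with $x = r\cos\theta$, $y = r\sin\theta$. The chain rule above gives
$$ \frac{\partial f}{\partial r} = \cos\theta\,\frac{\partial f}{\partial x} + \sin\theta\,\frac{\partial f}{\partial y}, \qquad \frac{\partial f}{\partial \theta} = -r\sin\theta\,\frac{\partial f}{\partial x} + r\cos\theta\,\frac{\partial f}{\partial y}, $$
which is precisely law $(1)$ with $A_j = \frac{\partial f}{\partial x^j}$: the new components are built from the old ones using $\frac{\partial x^j}{\partial \overline x^i}$, the covariant pattern.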
Answered by The One on September 4, 2021