TransWikia.com

Vector transformation derived from partial derivatives

Physics Asked on February 5, 2021

I am studying the lecture on general relativity for beginners here: https://youtu.be/foRPKAKZWx8?t=1998

I was able to follow the steps up until this point (33:18 in the video). During the deriving of the metric tensor, he is talking about how the vector transforms in a change of basis. He derived a formula using a gradient formula with partial derivatives by simply replacing the gradient by a vector.

[Image from the lecture: the boxed gradient formula, with the derived vector-transformation formula circled beneath it.]

The circled formula is what he derived from the boxed formula above. As I understand it, $X$ and $Y$ here are frames of reference (rather than coordinates). How is the partial derivative of $Y$ with respect to $X$ obtained?

2 Answers

In physics, a "frame" is nothing more than another word for a coordinate system. There are distinct objects known as frame fields, but by the looks of it, this is not what the lecturer is describing (they are, in my opinion, a more philosophically pleasing description of things, but more mathematically involved).

So, in the end there are some coordinates $x$ and some coordinates $y$. These are related to each other in the sense that one set of coordinates can be written as functions of the other set. For example, we may write $x^\mu=x^\mu(y)$ or $y^\mu=y^\mu(x)$, depending on whether we want to describe things in terms of the $x$ or $y$ coordinates.

As an example, in two dimensions, we may describe things in terms of Cartesian coordinates $(x,y)$ or polar coordinates $(r,\theta)$, and we may write $$ r(x,y)=\sqrt{x^2+y^2},\qquad \theta(x,y)=\arctan(y/x). $$ Similarly, we could invert this coordinate transformation and write $$ x(r,\theta)=r\cos\theta,\qquad y(r,\theta)=r\sin\theta. $$
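As a quick numerical sanity check (my own illustration, not part of the answer), the two coordinate maps above really are inverses of each other. Note that in code `atan2(y, x)` is used rather than `arctan(y/x)`, since it handles all quadrants:

```python
# Check that Cartesian -> polar -> Cartesian returns the starting point.
import math

def to_polar(x, y):
    r = math.sqrt(x**2 + y**2)
    theta = math.atan2(y, x)  # quadrant-safe version of arctan(y/x)
    return r, theta

def to_cartesian(r, theta):
    return r * math.cos(theta), r * math.sin(theta)

x0, y0 = 3.0, 4.0
r0, t0 = to_polar(x0, y0)      # r0 == 5.0
x1, y1 = to_cartesian(r0, t0)  # recovers (3.0, 4.0) up to rounding
```
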

So, writing the Jacobian circled in the question out more fully, we would have $$ J^\mu_\nu=\frac{\partial y^\mu}{\partial x^\nu}=\frac{\partial y^\mu(x)}{\partial x^\nu}. $$
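To make the Jacobian concrete (a sketch of my own, using the polar example above with $y=(r,\theta)$ and $x=(x,y)$), the analytic partial derivatives $\partial r/\partial x = x/r$, $\partial r/\partial y = y/r$, $\partial\theta/\partial x = -y/r^2$, $\partial\theta/\partial y = x/r^2$ can be checked against a finite-difference approximation:

```python
# Approximate J[mu][nu] = d y^mu / d x^nu by forward differences
# and compare with the analytic Jacobian of (x, y) -> (r, theta).
import math

def y_of_x(x, y):
    return [math.sqrt(x**2 + y**2), math.atan2(y, x)]

def jacobian_fd(x, y, h=1e-6):
    f0 = y_of_x(x, y)
    cols = []
    for dx, dy in [(h, 0.0), (0.0, h)]:
        f1 = y_of_x(x + dx, y + dy)
        cols.append([(a - b) / h for a, b in zip(f1, f0)])
    # transpose so that J[mu][nu] = d y^mu / d x^nu
    return [[cols[nu][mu] for nu in range(2)] for mu in range(2)]

x0, y0 = 3.0, 4.0
r = math.hypot(x0, y0)
analytic = [[x0 / r,      y0 / r],
            [-y0 / r**2,  x0 / r**2]]
numeric = jacobian_fd(x0, y0)
for mu in range(2):
    for nu in range(2):
        assert abs(numeric[mu][nu] - analytic[mu][nu]) < 1e-5
```
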

Answered by Richard Myers on February 5, 2021

The coordinates are functions on a manifold. Say you have two sets of coordinates $y^i$, $x^i$ and a transformation relation $y^i(x^j)$. Forming the gradient of these functions gives you:

$$dy^i=\sum_j\frac{\partial y^i}{\partial x^j}dx^j$$

Now this gradient is a linear machine (a 1-form) that takes a vector as input and tells you how quickly the coordinate changes in the direction of that vector. So let us apply it to the vector $\vec{V}=\sum_i V^i_y\vec{e^y_i}=\sum_i V^i_x\vec{e^x_i}$, where $\vec{e^y_i}$ is the coordinate basis vector for the coordinate $y^i$ (and analogously for the $x$ coordinates):

$$dy^i(\vec{V})=\sum_jV^j_y\,dy^i(\vec{e^y_j})=V^i_y$$ $$\sum_j\frac{\partial y^i}{\partial x^j}dx^j(\vec{V})=\sum_{j,k}\frac{\partial y^i}{\partial x^j}V^k_x\,dx^j(\vec{e^x_k})=\sum_{j}\frac{\partial y^i}{\partial x^j}V^j_x$$

In the first equality, I have used the fact that a 1-form is a linear machine. In the second equality, I used the fact that $dy^i(\vec{e^y_j})=\frac{\partial y^i}{\partial y^j}=\delta^i_j$. The rest is substituting these two results into the formula for the gradient above, which gives the desired formula $V^i_y=\sum_j\frac{\partial y^i}{\partial x^j}V^j_x$.
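This component-transformation law can be verified numerically (my own sketch, reusing the Cartesian/polar example from the first answer). The velocity of a curve is a vector: differentiating the curve in Cartesian coordinates gives $V^j_x$, differentiating its polar description gives $V^i_y$ directly, and the two must be related by the Jacobian:

```python
# Verify V^i_y = sum_j (d y^i / d x^j) V^j_x for the velocity of a curve,
# with x = Cartesian coordinates and y = polar coordinates.
import math

t0, h = 0.3, 1e-6

def curve(t):  # an arbitrary smooth curve in the plane (a hypothetical example)
    return math.cos(t) + 2.0, math.sin(2 * t)

def polar(p):
    x, y = p
    return math.hypot(x, y), math.atan2(y, x)

# Cartesian components V^j_x = d x^j / dt (central differences)
xp, xm = curve(t0 + h), curve(t0 - h)
Vx = [(a - b) / (2 * h) for a, b in zip(xp, xm)]

# Polar components V^i_y = d y^i / dt, obtained by differentiating directly
pp, pm = polar(curve(t0 + h)), polar(curve(t0 - h))
Vy_direct = [(a - b) / (2 * h) for a, b in zip(pp, pm)]

# The same components obtained via the Jacobian d y^i / d x^j
x0, y0 = curve(t0)
r = math.hypot(x0, y0)
J = [[x0 / r, y0 / r], [-y0 / r**2, x0 / r**2]]
Vy_jac = [sum(J[i][j] * Vx[j] for j in range(2)) for i in range(2)]

for a, b in zip(Vy_direct, Vy_jac):
    assert abs(a - b) < 1e-6
```

Both routes give the same polar components, which is exactly the content of the circled formula.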

P.S.

I think calling $df$ a gradient is not correct terminology. $df$ is a 1-form, while the gradient is a vector ($\vec{\nabla} f$). The gradient is defined as the vector dual to $df$ through the metric.

Answered by Umaxo on February 5, 2021

