# Ngô Quốc Anh

## February 3, 2011

### Components of a tensor under a change of coordinates

Filed under: Riemannian geometry — Ngô Quốc Anh @ 19:29

During the Cosmology class several days ago, the lecturer told us the following interesting fact, which should be useful for beginners studying Riemannian geometry.

Let us consider a covariant vector, also known as a $(0,1)$-tensor, $v_\alpha dx^\alpha$. The word covariant says that the components, $v_\alpha$, vary in the same “direction” as the change of coordinates. To be precise, under a change of coordinates

$\displaystyle x^\alpha \mapsto x^{\overline \alpha}$

we have

${v_\alpha }d{x^\alpha } = \underbrace {{v_\alpha }\frac{{\partial {x^\alpha }}}{{\partial {x^{\overline \alpha }}}}}_{{v_{\overline \alpha }}}d{x^{\overline \alpha }}.$

Now let us consider a contravariant vector, also known as a $(1,0)$-tensor, $v^\alpha \frac{\partial}{\partial x^\alpha}$. The word contravariant says that the components, $v^\alpha$, vary in the opposite “direction” to the change of coordinates. To be precise, under the change of coordinates above, we obtain

$\displaystyle {v^\alpha }\frac{\partial }{{\partial {x^\alpha }}} = {v^\alpha }\frac{{\partial {x^{\overline \alpha }}}}{{\partial {x^\alpha }}}\frac{\partial }{{\partial {x^{\overline \alpha }}}} = \underbrace {{v^\alpha }\frac{{\partial {x^{\overline \alpha }}}}{{\partial {x^\alpha }}}}_{{v^{\overline \alpha }}}\frac{\partial }{{\partial {x^{\overline \alpha }}}},$

where the first equality is just the chain rule $\frac{\partial}{\partial x^\alpha} = \frac{\partial x^{\overline\alpha}}{\partial x^\alpha}\frac{\partial}{\partial x^{\overline\alpha}}$.

In practice, we usually omit either $dx^\alpha$ or $\frac{\partial}{\partial x^\alpha}$, so we simply write

$\displaystyle {v_{\overline \alpha }} = {v_\alpha }\frac{{\partial {x^\alpha }}}{{\partial {x^{\overline \alpha }}}},\quad {v^{\overline \alpha }} = {v^\alpha }\frac{{\partial {x^{\overline \alpha }}}}{{\partial {x^\alpha }}}$

or equivalently

$\displaystyle {v_\alpha } = \frac{{\partial {x^{\overline \alpha }}}}{{\partial {x^\alpha }}}{v_{\overline \alpha }}, \quad {v^\alpha } = \frac{{\partial {x^\alpha }}}{{\partial {x^{\overline \alpha }}}}{v^{\overline \alpha }}.$
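These two rules are easy to check numerically. Below is a minimal sketch (my own illustration, not from the original discussion), taking the unbarred coordinates to be Cartesian $(x,y)$ and the barred coordinates to be polar $(r,\theta)$, with the Jacobian matrices playing the role of the partial-derivative factors:

```python
import numpy as np

# Change of coordinates: unbarred (x, y), barred (r, theta),
# related by x = r cos(theta), y = r sin(theta).
r, theta = 2.0, 0.7  # an arbitrary point away from the origin

# Jacobian matrix of dx^alpha / dx^{alpha-bar} (unbarred over barred).
J = np.array([[np.cos(theta), -r * np.sin(theta)],
              [np.sin(theta),  r * np.cos(theta)]])
# Inverse Jacobian: dx^{alpha-bar} / dx^alpha.
J_inv = np.linalg.inv(J)

# Covariant components transform with dx^alpha / dx^{alpha-bar}.
v_lower = np.array([2.0, 3.0])   # components v_alpha in Cartesian coordinates
v_lower_bar = J.T @ v_lower      # v_{alpha-bar} = v_alpha dx^alpha/dx^{alpha-bar}

# Contravariant components transform with dx^{alpha-bar} / dx^alpha.
v_upper = np.array([2.0, 3.0])   # components v^alpha in Cartesian coordinates
v_upper_bar = J_inv @ v_upper    # v^{alpha-bar} = v^alpha dx^{alpha-bar}/dx^alpha

# The pairing v_alpha v^alpha is the same in both coordinate systems.
print(np.dot(v_lower, v_upper), np.dot(v_lower_bar, v_upper_bar))
```

The invariance of the pairing $v_\alpha v^\alpha$ is exactly why the two rules use mutually inverse Jacobian factors.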

These rules suggest an easy way to compute the transformation of a general $(p,q)$-tensor. For example, for a $(0,2)$-tensor $g_{\alpha\beta}$ (you may think of this as a metric tensor), we proceed as follows.

1. We transform the lower index $\alpha$ to the lower index $\overline \alpha$. To do this, we multiply by a factor of the form $\frac{\partial x}{\partial x}$: if the index being transformed sits downstairs on the LHS, the original (unbarred) coordinate goes in the denominator; if it sits upstairs, the original coordinate moves to the numerator. To be exact, the factor that we need to multiply by here is nothing but

$\displaystyle\frac{\partial x^{\overline \alpha}}{\partial x^\alpha}.$

2. We now transform the second lower index $\beta$ to the lower index $\overline \beta$. In the same way, we get the factor

$\displaystyle\frac{\partial x^{\overline \beta}}{\partial x^\beta}.$

3. Overall, we conclude that

$\displaystyle {g_{\alpha \beta }} = \frac{{\partial {x^{\overline\alpha} }}}{{\partial {x^{\alpha }}}}\frac{{\partial {x^{\overline\beta} }}}{{\partial {x^{\beta }}}}{g_{\overline \alpha \overline \beta }}.$
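As a sanity check of this $(0,2)$ rule, one can transform the Euclidean metric between Cartesian and polar coordinates numerically. A small sketch (the coordinates and sample point are my own choice, not from the post):

```python
import numpy as np

r, theta = 2.0, 0.7  # an arbitrary point with r > 0

# Jacobian dx^alpha / dx^{alpha-bar} for x = r cos(theta), y = r sin(theta),
# with unbarred = Cartesian, barred = polar.
J = np.array([[np.cos(theta), -r * np.sin(theta)],
              [np.sin(theta),  r * np.cos(theta)]])
J_inv = np.linalg.inv(J)  # dx^{alpha-bar} / dx^alpha

g_cart = np.eye(2)          # g_{alpha beta}: Euclidean metric, Cartesian components
g_polar = J.T @ g_cart @ J  # g_{ab-bar} = (dx^a/dx^{a-bar})(dx^b/dx^{b-bar}) g_{ab}
print(g_polar)              # numerically equal to diag(1, r^2)

# The displayed formula goes the other way: recover g_{alpha beta}
# from g_{ab-bar} using the barred-over-unbarred factors.
g_back = J_inv.T @ g_polar @ J_inv
print(np.allclose(g_back, g_cart))  # True
```

The off-diagonal terms of `g_polar` vanish and the $(\theta,\theta)$ component is $r^2$: the familiar line element $dr^2 + r^2\,d\theta^2$.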

Another example: what happens to a $(1,1)$-tensor ${g^\alpha}_\beta$? Again,

1. We transform the upper index $\alpha$ to the upper index $\overline \alpha$. Following the above discussion, the factor we need to multiply by is nothing but

$\displaystyle\frac{\partial x^{\alpha}}{\partial x^{\overline\alpha}}.$

2. We transform the lower index $\beta$ to the lower index $\overline \beta$ via the factor

$\displaystyle\frac{\partial x^{\overline \beta}}{\partial x^\beta}.$

3. Combining these gives

$\displaystyle {{g^\alpha}_\beta } = \frac{{\partial {x^{ \alpha }}}}{{\partial {x^{\overline\alpha} }}}\frac{{\partial {x^{\overline\beta} }}}{{\partial {x^{\beta }}}}{g^{\overline \alpha }}_{\overline \beta }.$
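A quick numerical check of this mixed rule, using the same Cartesian-to-polar setup as before (my own illustration): the Kronecker delta ${\delta^\alpha}_\beta$ keeps the same components in every coordinate system, because the upper and lower factors are mutually inverse Jacobians and cancel.

```python
import numpy as np

r, theta = 2.0, 0.7
J = np.array([[np.cos(theta), -r * np.sin(theta)],
              [np.sin(theta),  r * np.cos(theta)]])  # dx^alpha / dx^{alpha-bar}
J_inv = np.linalg.inv(J)                             # dx^{alpha-bar} / dx^alpha

delta_bar = np.eye(2)  # delta^{alpha-bar}_{beta-bar} in polar coordinates
# g^alpha_beta = (dx^alpha/dx^{alpha-bar})(dx^{beta-bar}/dx^beta) g^{alpha-bar}_{beta-bar},
# which in matrix form is J @ delta_bar @ J_inv.
delta = J @ delta_bar @ J_inv
print(np.allclose(delta, np.eye(2)))  # True: the delta is coordinate-independent
```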

Lastly, for a $(2,0)$-tensor, it is easy to see that

$\displaystyle {g^{\alpha \beta }} = \frac{{\partial {x^{ \alpha }}}}{{\partial {x^{\overline\alpha} }}}\frac{{\partial {x^{\beta }}}}{{\partial {x^{\overline\beta} }}}{g^{\overline \alpha \overline \beta }}.$
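Finally, the $(2,0)$ rule can be checked on the inverse metric: in polar coordinates $g^{\overline\alpha\overline\beta} = \operatorname{diag}(1, 1/r^2)$, and the displayed formula should return the Cartesian components $\delta^{\alpha\beta}$. A small sketch under the same assumed setup as above:

```python
import numpy as np

r, theta = 2.0, 0.7
J = np.array([[np.cos(theta), -r * np.sin(theta)],
              [np.sin(theta),  r * np.cos(theta)]])  # dx^alpha / dx^{alpha-bar}

g_polar_up = np.diag([1.0, 1.0 / r**2])  # g^{ab-bar}: inverse metric, polar components
# g^{alpha beta} = (dx^alpha/dx^{alpha-bar})(dx^beta/dx^{beta-bar}) g^{ab-bar},
# i.e. J @ g_polar_up @ J.T in matrix form.
g_cart_up = J @ g_polar_up @ J.T
print(np.allclose(g_cart_up, np.eye(2)))  # True: back to the Euclidean inverse metric
```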
