Ngô Quốc Anh

December 19, 2012

How to decompose tensors into a purely spatial part and a timelike part?

Filed under: Riemannian geometry — Tags: — Ngô Quốc Anh @ 15:02

Today we discuss how to decompose tensors into a purely spatial part, which lies in the hypersurface M, and a timelike part, which is normal to the spatial surface M.

Let us recall that M (called the spatial surface) is a hypersurface of V (called the spacetime), which has dimension n+1. At each point p\in M, the space of all spacetime vectors can be orthogonally decomposed as

\displaystyle T_p(V)=T_p(M) \oplus \text{span}(n),

where \text{span}(n) stands for the 1-dimensional subspace of T_p(V) generated by the unit normal vector n to the surface M. Note that n is timelike, so it is normalized by n \cdot n = -1.

To do so, we need two projection operators.

The orthogonal projector onto M. In the literature, this operator is usually denoted by the symbol \gamma and is given by

\displaystyle \begin{gathered} \gamma :{T_p}(V) \to {T_p}(M) \hfill \\ \qquad \quad\,\,\,v \mapsto v + (n \cdot v)n. \hfill \\ \end{gathered}

According to the above decomposition, the spacetime metric splits as

\displaystyle g_V = g_M - n \otimes n,

that is, g_M = g_V + n \otimes n. With respect to any basis (e_i) of the space T_p(V), we therefore have

\displaystyle \gamma_{ij}=g_{ij}+n_in_j,

which, by raising indices, gives

\displaystyle \gamma^i_{j}=\delta^i_{j}+n^in_j.

For any spacetime vector v=(v^j), since

\displaystyle {n_i}\gamma _j^i{v^j} = {n_i}(\delta _j^i + {n^i}{n_j}){v^j} = ({n_j} + \underbrace {{n_i}{n^i}}_{ - 1}{n_j}){v^j} = 0,

we know that \gamma _j^i{v^j} is orthogonal to n, hence purely spatial. Moreover, \gamma _j^i{n^j} = n^i + ({n_j}{n^j}){n^i} = 0, so \gamma annihilates the normal direction. This supports our construction of the orthogonal projection \gamma.
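As a quick sanity check, these properties of \gamma can be verified numerically. The sketch below works in 4-dimensional Minkowski space with the normal n = \partial_t; both the metric and the normal are illustrative assumptions, not part of the general construction.

```python
import numpy as np

# Illustrative setup: Minkowski metric (signature -+++) and normal n = d/dt.
g = np.diag([-1.0, 1.0, 1.0, 1.0])
n_up = np.array([1.0, 0.0, 0.0, 0.0])   # n^i
n_dn = g @ n_up                         # n_i = g_{ij} n^j
assert np.isclose(n_dn @ n_up, -1.0)    # n is unit timelike: n . n = -1

# Orthogonal projector gamma^i_j = delta^i_j + n^i n_j
gamma = np.eye(4) + np.outer(n_up, n_dn)

v = np.array([3.0, 1.0, -2.0, 0.5])     # an arbitrary spacetime vector
print(gamma @ n_up)                     # gamma annihilates the normal (all zeros)
print(n_dn @ (gamma @ v))               # gamma v is orthogonal to n (zero)
```

For this choice of normal, gamma works out to diag(0, 1, 1, 1): it simply deletes the time component of a vector, as expected.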

To project higher rank tensors into the spatial surface, each free index has to be contracted with a projection operator. It is sometimes convenient to denote this projection with a symbol \top, for example,

\displaystyle\top{T_{ij}} = \gamma _i^p\gamma _j^q{T_{pq}}, \quad \top{T^{ij}} = \gamma _p^i\gamma _q^j{T^{pq}}.

It can be checked that \gamma is nothing but the spatial (induced) metric of M.

The normal projector onto \text{span}(n). According to the decomposition

\displaystyle \gamma^i_{j}=\delta^i_{j}+n^in_j,

one can easily see that the term \delta^i_{j}-\gamma^i_{j} gives us the normal part of the decomposition. In other words, the normal projector is nothing but

\displaystyle N^i_j=-n^in_j.

We can denote the normal projector by \bot, for example, for any spacetime vector v=(v^i), we obtain

\displaystyle\bot v = N_j^i{v^j} = - {n^i}{n_j}{v^j}.
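That \gamma and N are complementary projectors can also be checked numerically. The following sketch again uses the 4-dimensional Minkowski metric and the normal n = \partial_t as illustrative assumptions.

```python
import numpy as np

# Illustrative setup: Minkowski metric (signature -+++) and normal n = d/dt.
g = np.diag([-1.0, 1.0, 1.0, 1.0])
n_up = np.array([1.0, 0.0, 0.0, 0.0])     # n^i
n_dn = g @ n_up                           # n_i

gamma = np.eye(4) + np.outer(n_up, n_dn)  # spatial projector gamma^i_j
N = -np.outer(n_up, n_dn)                 # normal projector N^i_j = -n^i n_j

print(np.allclose(gamma + N, np.eye(4)))  # gamma + N = delta        -> True
print(np.allclose(N @ N, N))              # N is idempotent          -> True
print(np.allclose(gamma @ N, 0.0))        # the images are orthogonal -> True
```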

Once we have these two projectors, we can decompose any tensor into its spatial and timelike parts. For example, given an arbitrary spacetime vector v^a, we obtain

\displaystyle {v^a} = \delta _b^a{v^b} = (\gamma _b^a + N_b^a){v^b} = \gamma _b^a{v^b} + N_b^a{v^b} = \top {v^a} - {n^a}{n_b}{v^b}.

Now, given a (0,2)-tensor T, we get

\displaystyle\begin{gathered} {T_{ab}} = \delta _a^c\delta _b^d{T_{cd}} \hfill \\ \quad\,\,\,\,= \delta _a^c(\gamma _b^d + N_b^d){T_{cd}} \hfill \\ \quad\,\,\,\,= (\gamma _a^c + N_a^c)(\gamma _b^d + N_b^d){T_{cd}} \hfill \\ \quad\,\,\,\,= (\gamma _a^c + N_a^c)(\gamma _b^d{T_{cd}} - {n^d}{n_b}{T_{cd}}) \hfill \\ \quad\,\,\,\,= \gamma _a^c\gamma _b^d{T_{cd}} + N_a^c\gamma _b^d{T_{cd}} - \gamma _a^c{n^d}{n_b}{T_{cd}} - N_a^c{n^d}{n_b}{T_{cd}} \hfill \\ \quad\,\,\,\,= \top {T_{ab}} - {n^c}{n_a}\underbrace {\gamma _b^d{T_{cd}}}_{ \top {T_{cb}}} - {n^d}{n_b}\underbrace {\gamma _a^c{T_{cd}}}_{ \top {T_{ad}}} + {n^c}{n_a}{n^d}{n_b}{T_{cd}}. \hfill \\ \end{gathered}

Thus, after re-indexing, we obtain

\displaystyle {T_{ab}} = \top {T_{ab}} - {n_a}{n^c} \top {T_{cb}} - {n_b}{n^c} \top {T_{ac}} + {n_a}{n_b}{n^c}{n^d}{T_{cd}},

where, as the underbraces above indicate, the symbol \top in the two mixed terms denotes projection on the free index only; projecting the contracted index as well would kill these terms, since n^c\gamma _c^p = 0.
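This four-term formula can be checked numerically on a random (0,2)-tensor; the Minkowski metric and the normal n = \partial_t below are illustrative assumptions.

```python
import numpy as np

# Illustrative setup: Minkowski metric (signature -+++) and normal n = d/dt.
g = np.diag([-1.0, 1.0, 1.0, 1.0])
n_up = np.array([1.0, 0.0, 0.0, 0.0])     # n^i
n_dn = g @ n_up                           # n_i
gamma = np.eye(4) + np.outer(n_up, n_dn)  # gamma[i, j] stores gamma^i_j

rng = np.random.default_rng(0)
T = rng.standard_normal((4, 4))           # an arbitrary (0,2)-tensor T_{cd}

# The four terms of the decomposition (gamma_a^c is gamma[c, a] here):
term1 = np.einsum('ca,db,cd->ab', gamma, gamma, T)              # top T_{ab}
term2 = -np.einsum('c,a,db,cd->ab', n_up, n_dn, gamma, T)       # -n_a n^c gamma_b^d T_{cd}
term3 = -np.einsum('d,b,ca,cd->ab', n_up, n_dn, gamma, T)       # -n_b n^d gamma_a^c T_{cd}
term4 = np.einsum('c,a,d,b,cd->ab', n_up, n_dn, n_up, n_dn, T)  # n_a n_b n^c n^d T_{cd}

print(np.allclose(term1 + term2 + term3 + term4, T))            # -> True
```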

Finally, for the Riemann curvature tensor {}^{(n+1)}R, we obtain the following decomposition

\displaystyle\begin{gathered} {R_{abcd}} = \delta _a^p\delta _b^q\delta _c^r\delta _d^s{R_{pqrs}} \hfill \\ \qquad\,\,= \delta _a^p\delta _b^q\delta _c^r(\gamma _d^s{R_{pqrs}} + N_d^s{R_{pqrs}}) \hfill \\ \qquad\,\,= \delta _a^p\delta _b^q(\gamma _c^r\gamma _d^s{R_{pqrs}} + N_c^r\gamma _d^s{R_{pqrs}} + \gamma _c^rN_d^s{R_{pqrs}} + N_c^rN_d^s{R_{pqrs}}) \hfill \\ \qquad\,\,= \delta _a^p(\gamma _b^q\gamma _c^r\gamma _d^s{R_{pqrs}} + N_b^q\gamma _c^r\gamma _d^s{R_{pqrs}} + \gamma _b^qN_c^r\gamma _d^s{R_{pqrs}} + N_b^qN_c^r\gamma _d^s{R_{pqrs}} \hfill \\ \qquad\,\,\qquad\,\,+ \gamma _b^q\gamma _c^rN_d^s{R_{pqrs}} + N_b^q\gamma _c^rN_d^s{R_{pqrs}} + \gamma _b^qN_c^rN_d^s{R_{pqrs}} + N_b^qN_c^rN_d^s{R_{pqrs}}) \hfill \\ \qquad\,\,= (\gamma _a^p + N_a^p)(\gamma _b^q\gamma _c^r\gamma _d^s{R_{pqrs}} + N_b^q\gamma _c^r\gamma _d^s{R_{pqrs}} + \gamma _b^qN_c^r\gamma _d^s{R_{pqrs}} + N_b^qN_c^r\gamma _d^s{R_{pqrs}} \hfill \\ \qquad\,\,\qquad\,\,+ \gamma _b^q\gamma _c^rN_d^s{R_{pqrs}} + N_b^q\gamma _c^rN_d^s{R_{pqrs}} + \gamma _b^qN_c^rN_d^s{R_{pqrs}} + N_b^qN_c^rN_d^s{R_{pqrs}}) \hfill \\ \qquad\,\,= (\gamma _a^p\gamma _b^q\gamma _c^r\gamma _d^s{R_{pqrs}} + \gamma _a^pN_b^q\gamma _c^r\gamma _d^s{R_{pqrs}} + \gamma _a^p\gamma _b^qN_c^r\gamma _d^s{R_{pqrs}} + \gamma _a^pN_b^qN_c^r\gamma _d^s{R_{pqrs}} \hfill \\ \qquad\,\,\qquad\,\,+ \gamma _a^p\gamma _b^q\gamma _c^rN_d^s{R_{pqrs}} + \gamma _a^pN_b^q\gamma _c^rN_d^s{R_{pqrs}} + \gamma _a^p\gamma _b^qN_c^rN_d^s{R_{pqrs}} + \gamma _a^pN_b^qN_c^rN_d^s{R_{pqrs}}) \hfill \\ \qquad\,\,\qquad\,\,+ (N_a^p\gamma _b^q\gamma _c^r\gamma _d^s{R_{pqrs}} + N_a^pN_b^q\gamma _c^r\gamma _d^s{R_{pqrs}} + N_a^p\gamma _b^qN_c^r\gamma _d^s{R_{pqrs}} + N_a^pN_b^qN_c^r\gamma _d^s{R_{pqrs}} \hfill \\ \qquad\,\,\qquad\,\,+ N_a^p\gamma _b^q\gamma _c^rN_d^s{R_{pqrs}} + N_a^pN_b^q\gamma _c^rN_d^s{R_{pqrs}} + N_a^p\gamma _b^qN_c^rN_d^s{R_{pqrs}} + N_a^pN_b^qN_c^rN_d^s{R_{pqrs}}). \hfill \\ \end{gathered}
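Although lengthy, this expansion is nothing more than \delta = \gamma + N applied to each of the four indices, so the sixteen terms must sum back to R_{abcd}. A numerical check, again in the illustrative Minkowski setup with a random tensor standing in for the curvature:

```python
import itertools
import numpy as np

# Illustrative setup: Minkowski metric (signature -+++) and normal n = d/dt.
g = np.diag([-1.0, 1.0, 1.0, 1.0])
n_up = np.array([1.0, 0.0, 0.0, 0.0])
n_dn = g @ n_up
gamma = np.eye(4) + np.outer(n_up, n_dn)  # spatial projector
N = -np.outer(n_up, n_dn)                 # normal projector

rng = np.random.default_rng(1)
R = rng.standard_normal((4, 4, 4, 4))     # stand-in for R_{pqrs}

# Sum P_a^p Q_b^q S_c^r U_d^s R_{pqrs} over all 16 ways of
# attaching gamma or N to the four indices.
total = sum(
    np.einsum('pa,qb,rc,sd,pqrs->abcd', P, Q, S, U, R)
    for P, Q, S, U in itertools.product([gamma, N], repeat=4)
)
print(np.allclose(total, R))              # -> True
```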

