
metric:
$\textbf{g} = \textbf{e} \cdot \textbf{e}$
$\textbf{g} = (e^a \otimes e_a) \cdot (e^b \otimes e_b)$
Here we use the fact that for one-forms $a, b$ and vectors $s, t$: $(a \otimes s) \cdot (b \otimes t) = (a \otimes b) \, (s \cdot t)$, since the dot product contracts only the vector factors, leaving the one-form factors in a tensor product.
$\textbf{g} = (e^a \otimes e^b) (e_a \cdot e_b)$
$\textbf{g} = g_{ab} \cdot e^a \otimes e^b$
Therefore $g_{ab} = e_a \cdot e_b$
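
For example, an orthonormal basis gives $g_{ab} = e_a \cdot e_b = \delta_{ab}$, while a 2D null basis $\{e_+, e_-\}$ with $e_\pm \cdot e_\pm = 0$ and $e_+ \cdot e_- = 1$ gives
$g_{++} = g_{--} = 0, \quad g_{+-} = g_{-+} = 1$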

expanding our non-coordinate basis in terms of a coordinate basis:
$g_{ab} = {e_a}^\tilde{a} {e_b}^\tilde{b} \partial_\tilde{a} \cdot \partial_\tilde{b}$

Let our coordinate basis metric be defined in the same way:
$\tilde{g}_{\tilde{a}\tilde{b}} = \partial_\tilde{a} \cdot \partial_\tilde{b}$

So $g_{ab} = {e_a}^\tilde{a} {e_b}^\tilde{b} \tilde{g}_{\tilde{a}\tilde{b}}$
and ${e^a}_\tilde{a} {e^b}_\tilde{b} g_{ab} = \tilde{g}_{\tilde{a}\tilde{b}}$
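
As a concrete example, take polar coordinates $(r, \theta)$ on the Euclidean plane, with coordinate basis $\partial_r, \partial_\theta$ and the usual orthonormal frame:
$\tilde{g}_{rr} = \partial_r \cdot \partial_r = 1$, $\tilde{g}_{\theta\theta} = \partial_\theta \cdot \partial_\theta = r^2$, $\tilde{g}_{r\theta} = 0$
$e_\hat{r} = \partial_r$, $e_\hat{\theta} = \frac{1}{r} \partial_\theta$, so ${e_\hat{r}}^r = 1$, ${e_\hat{\theta}}^\theta = \frac{1}{r}$
$g_{\hat{\theta}\hat{\theta}} = {e_\hat{\theta}}^\theta {e_\hat{\theta}}^\theta \tilde{g}_{\theta\theta} = \frac{1}{r^2} \cdot r^2 = 1$, and likewise $g_{ab} = \delta_{ab}$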

Metric inverse:
Define $g^{ab}$ such that $ g^{ac} g_{cb} = \delta^a_b$
This means that if the components are arranged as a matrix $[g_{ab}]$, then the components $[g^{ab}]$ form the inverse of that matrix.
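
For a diagonal metric this is just the reciprocal of each entry. For example, the polar coordinate metric $\tilde{g}_{\tilde{a}\tilde{b}} = diag(1, r^2)$ has inverse $\tilde{g}^{\tilde{a}\tilde{b}} = diag(1, \frac{1}{r^2})$, and indeed $\tilde{g}^{\tilde{a}\tilde{c}} \tilde{g}_{\tilde{c}\tilde{b}} = \delta^\tilde{a}_\tilde{b}$.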

Raising / lowering indexes:
Contravariant form is defined as: $v^a = g^{ab} v_b$
Covariant form is defined as: $v_a = g_{ab} v^b$

Notice the formal equivalence: $v = v^a e_a = v_b g^{bc} g_{ca} e^a = v_b \delta^b_a e^a = v_a e^a$ (identifying $g_{ca} e^a$ with $e_c$).
However notice that $e_a$ is a vector, while $e^a$ is a one-form, i.e. a function $V \rightarrow \mathbb{R}$, so $v^a e_a$ and $v_a e^a$ strictly live in different spaces.
Let $v^\sharp$ denote the representation of $v$ with all indexes raised: $v^\sharp = v^a e_a$.
Let $v^\flat$ denote the representation of $v$ with all indexes lowered: $v^\flat = v_a e^a$.
The same extends to multiple indexes:
For $T = {T^I}_J \, {e_I}^J$ (with multi-indexes $I, J$), $T^\sharp = T^{IJ} e_{IJ}$ and $T^\flat = T_{IJ} e^{IJ}$.
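
For example, in the polar coordinate basis with $\tilde{g}_{\tilde{a}\tilde{b}} = diag(1, r^2)$:
$v_r = \tilde{g}_{rr} v^r = v^r$, $v_\theta = \tilde{g}_{\theta\theta} v^\theta = r^2 v^\theta$
and raising with $\tilde{g}^{\tilde{a}\tilde{b}} = diag(1, \frac{1}{r^2})$ recovers the original components.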

Changing components under a change of coordinate basis:
$v = v^a \frac{\partial}{\partial x^a}$
$v = (v')^b \frac{\partial}{\partial y^b}$
$v = (v')^b \frac{\partial x^a}{\partial y^b} \frac{\partial}{\partial x^a}$
Therefore $(v')^b \frac{\partial x^a}{\partial y^b} = v^a$
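
For example, with $x^a = (x, y)$ Cartesian and $y^b = (r, \theta)$ polar, so $x = r \cos\theta$, $y = r \sin\theta$:
$v^x = \frac{\partial x}{\partial r} (v')^r + \frac{\partial x}{\partial \theta} (v')^\theta = \cos\theta \, (v')^r - r \sin\theta \, (v')^\theta$
$v^y = \frac{\partial y}{\partial r} (v')^r + \frac{\partial y}{\partial \theta} (v')^\theta = \sin\theta \, (v')^r + r \cos\theta \, (v')^\theta$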

Converting components between the non-coordinate and coordinate basis:
$v^\tilde{a} = {e_a}^\tilde{a} v^a$
$v_\tilde{a} = {e^a}_\tilde{a} v_a$
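
For example, with $e_\hat{\theta} = \frac{1}{r} \partial_\theta$ (so ${e_\hat{\theta}}^\theta = \frac{1}{r}$ and ${e^\hat{\theta}}_\theta = r$):
$v^\theta = \frac{1}{r} v^\hat{\theta}$ and $v_\theta = r \, v_\hat{\theta}$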

Metric coordinate invariance:
For coordinate metric $g_{ab}$:
Let $g' = {g'}_{a'b'} dx^{a'} \cdot dx^{b'}$
$ = {g'}_{a'b'} \frac{\partial x^{a'}}{\partial x^a} dx^a \cdot \frac{\partial x^{b'}}{\partial x^b} dx^b$ using $dx^{a'} = \frac{\partial x^{a'}}{\partial x^a} dx^a$
$ = {g'}_{ab} \, dx^a \cdot dx^b$ where ${g'}_{ab} = \frac{\partial x^{a'}}{\partial x^a} \frac{\partial x^{b'}}{\partial x^b} {g'}_{a'b'}$ are the index-transformed components
Therefore $g = g'$ as a tensor, regardless of the change of coordinates.
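
For example, the Euclidean plane metric in Cartesian versus polar coordinates is the same tensor:
$dx \cdot dx + dy \cdot dy = (\cos\theta \, dr - r \sin\theta \, d\theta)^2 + (\sin\theta \, dr + r \cos\theta \, d\theta)^2 = dr \cdot dr + r^2 \, d\theta \cdot d\theta$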

Metric determinant:
$det([\tilde{g}_{\tilde{a}\tilde{b}}]) = det( [{e^a}_\tilde{a} g_{ab} {e^b}_\tilde{b}] )$
$det([\tilde{g}_{\tilde{a}\tilde{b}}]) = det( [{e^a}_\tilde{a}] )^2 det([g_{ab}])$
Writing $\tilde{g} = det[\tilde{g}_{\tilde{a}\tilde{b}}]$, $g = det[g_{ab}]$, and $e = det[{e^a}_\tilde{a}]$:
$\tilde{g} = e^2 g$
$e = \sqrt{\frac{\tilde{g}}{g}}$ (up to sign)
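
Continuing the polar example above: $\tilde{g} = det(diag(1, r^2)) = r^2$, $g = det(\delta_{ab}) = 1$, and $e = {e^\hat{r}}_r \, {e^\hat{\theta}}_\theta = 1 \cdot r = r = \sqrt{\frac{r^2}{1}}$.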



Every nondegenerate symmetric metric can be diagonalized into a signature metric.
Let $\eta_{ij} = diag(\sigma_i) = \delta_{ij} \cdot \sigma_i$, where $\sigma_i = \pm 1$.
Let its signature be $\sigma = \overset{n}{\underset{i=1}{\Pi}} \sigma_i = det[\eta_{ij}] = \pm 1$.
Euclidean signature: $\sigma_i = \{+1, +1, +1\}; \sigma = +1$
Lorentzian signature for particle physicists: $\sigma_i = \{+1, -1, -1, -1\}; \sigma = -1$
Lorentzian signature for cosmologists: $\sigma_i = \{-1, +1, +1, +1\}; \sigma = -1$
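
For example, the 2D null metric from above ($g_{++} = g_{--} = 0$, $g_{+-} = 1$) diagonalizes to Lorentzian signature: with $e'_1 = \frac{1}{\sqrt{2}}(e_+ + e_-)$ and $e'_2 = \frac{1}{\sqrt{2}}(e_+ - e_-)$,
$e'_1 \cdot e'_1 = +1$, $e'_2 \cdot e'_2 = -1$, $e'_1 \cdot e'_2 = 0$, so $\eta_{ij} = diag(+1, -1)$ and $\sigma = -1$.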

Any $e_a$ coordinate basis can be transformed into an $(e')_i = {(e')^a}_i e_a$ non-coordinate basis, such that $(e')_i \cdot (e')_j = \eta_{ij}$.
In fact, the choice is not unique: writing the metric matrix as $G = [g_{ab}]$ and the diagonal matrix as $V$, any number of $Q$ exist such that $Q V Q^{-1} = G$.
This means that, while the basis $(e')_i$ gives the metric its signature form $\eta_{ij}$, it is not necessarily holonomic, and therefore will generally have nonzero structure constants.
Nonzero structure constants imply the connection is not symmetric in its last two indexes, while a constant (Minkowski) metric implies the connection is antisymmetric in its first two indexes.
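
For example, the orthonormal polar frame $e_\hat{r} = \partial_r$, $e_\hat{\theta} = \frac{1}{r} \partial_\theta$ has $g_{ab} = \delta_{ab}$ but is anholonomic:
$[e_\hat{r}, e_\hat{\theta}] = -\frac{1}{r^2} \partial_\theta = -\frac{1}{r} e_\hat{\theta}$, so its structure constants are nonzero.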

How about uniqueness? For a specific $\eta_{ij}$ diagonalization, the transformation ${(e')^a}_i$ from the coordinate basis $\partial_a$ to the non-coordinate basis $(e')_i = {(e')^a}_i \partial_a$ is unique only up to transformations that preserve $\eta_{ij}$ (rotations, or Lorentz transformations in the Lorentzian case).



Sums of metrics

Check out my derivations at https://thenumbernine.github.io/symmath/tests/output/sum%20of%20two%20metrics.html.

The perturbation folks like to use the sum of two metrics: $g_{ab} = \eta_{ab} + \epsilon_{ab}$.

But really, isn't this a short-sighted idea? If we are dealing with perturbations of the basis, shouldn't we be using the exponential map of a perturbation instead?

$\Theta = \Theta_\epsilon + \Theta_0$ and $E = \exp(\Theta) = \exp(\Theta_\epsilon) \cdot \exp(\Theta_0)$, assuming $\Theta_\epsilon$ and $\Theta_0$ commute (otherwise Baker-Campbell-Hausdorff correction terms appear).

$G = E E^T = \exp(\Theta_\epsilon) \cdot \exp(\Theta_0) \cdot \exp(\Theta_0)^T \cdot \exp(\Theta_\epsilon)^T = \exp(\Theta_\epsilon) \cdot G_0 \cdot \exp(\Theta_\epsilon)^T$
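
A small sanity check: take $\Theta_0 = 0$ (so $G_0 = I$) and let $\Theta_\epsilon$ be symmetric. Then
$G = \exp(\Theta_\epsilon) \exp(\Theta_\epsilon)^T = \exp(2 \Theta_\epsilon) = I + 2 \Theta_\epsilon + 2 \Theta_\epsilon^2 + \ldots$
so to first order this reproduces the additive form $\eta_{ab} + \epsilon_{ab}$ with $\epsilon_{ab} = 2 (\Theta_\epsilon)_{ab}$.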


Inverse of sums of metrics

$g_{ab} = g'_{ab} + g''_{ab}$

Using Miller's 1981 paper on the inverse of the sum of matrices (the closed form below is exact when $g''_{ab}$ has rank 1; higher rank is handled by repeated rank-1 updates):
Let ${r^a}_b = g'^{ac} g''_{cb}$.
Let $r = g'^{uv} g''_{uv} = {r^a}_a$.
$g^{ab} = g'^{ab} - (g'^{ac} g''_{cd} g'^{db}) \cdot \frac{1}{1 + r}$
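
A quick check with a rank-1 perturbation of the 2D identity metric, $g'_{ab} = \delta_{ab}$ and $g''_{ab} = diag(1, 0)$:
$r = g'^{uv} g''_{uv} = 1$, so $g^{ab} = \delta^{ab} - \frac{1}{2} diag(1, 0) = diag(\frac{1}{2}, 1)$
which is indeed the inverse of $g_{ab} = g'_{ab} + g''_{ab} = diag(2, 1)$.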

