Parallel propagator along a single coordinate:
${P_\mu(x_L^\mu, x_R^\mu)^\alpha}_\nu = exp\left( -\int_{x'^\mu = x_L^\mu}^{x'^\mu = x_R^\mu} \Gamma_\mu(x'^\mu) dx'^\mu \right)$
...where $\Gamma_\mu$ is a matrix with components ${\Gamma^\alpha}_{\mu\nu}$
Parallel propagator linearly interpolated between coordinates:
$P(x,y) = exp\left( -\int_{\lambda = 0}^{\lambda = 1} \Gamma_\mu(x + \lambda v) v^\mu d\lambda \right)$, where $v = y - x$ ...strictly this is a path-ordered exponential; the plain $exp$ is exact when the $\Gamma_\mu v^\mu$ at different $\lambda$ commute.
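A quick numerical sanity check of this formula — a minimal Python/NumPy sketch with an assumed example (polar coordinates $(r, \theta)$, purely radial path), not anything from the derivation above. Along a radial line the only relevant Christoffel is ${\Gamma^\theta}_{r\theta} = 1/r$, the $\Gamma$'s along the path commute, and the plain exponential is exact:

```python
# Sketch (assumed example): parallel propagator P = exp(-∫ Γ_μ v^μ dλ)
# along a radial line in polar coordinates.
# Only nonzero Christoffel entering Γ_r is Γ^θ_{rθ} = 1/r (rows/cols = r, θ).
import numpy as np
from scipy.linalg import expm

def Gamma_r(r):
    return np.array([[0.0, 0.0],
                     [0.0, 1.0 / r]])

theta = 0.7            # fixed angle; path runs from r_L to r_R
rL, rR = 1.0, 3.0
v = rR - rL            # v^r for the straight-line parameterization x + λ v

# midpoint quadrature of ∫_0^1 Γ_r(r(λ)) v dλ
n = 2000
lam = (np.arange(n) + 0.5) / n
A = -sum(Gamma_r(rL + l * v) * v for l in lam) / n
P = expm(A)            # expect diag(1, rL/rR)

# exact check: components of a constant Cartesian vector w in the polar basis
w = np.array([0.3, -1.1])
def polar_components(w, r, theta):
    vr = w[0] * np.cos(theta) + w[1] * np.sin(theta)
    vth = (-w[0] * np.sin(theta) + w[1] * np.cos(theta)) / r
    return np.array([vr, vth])

assert np.allclose(P @ polar_components(w, rL, theta),
                   polar_components(w, rR, theta))
```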
Matrix exponential:
$exp(A) = exp(R \cdot \Lambda \cdot L) = R \cdot exp(\Lambda) \cdot L$
$R^{-1} \cdot exp(A) \cdot L^{-1} = exp(\Lambda)$
$\Lambda = log(R^{-1} \cdot exp(A) \cdot L^{-1})$
$A = R \cdot \Lambda \cdot L = R \cdot log(R^{-1} \cdot exp(A) \cdot L^{-1}) \cdot L$
$A = log(exp(A)) = log(exp(R \cdot \Lambda \cdot L)) = log(R \cdot exp(\Lambda) \cdot L)$
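Minimal NumPy sketch of this identity (a random diagonalizable matrix is assumed as the example):

```python
# Sketch: exp(A) = R · exp(Λ) · L for a diagonalizable A, with L = R^{-1}.
import numpy as np
from scipy.linalg import expm

A = np.random.default_rng(0).normal(size=(4, 4))   # generic, diagonalizable
lam, R = np.linalg.eig(A)            # columns of R = right eigenvectors
L = np.linalg.inv(R)                 # rows of L = left eigenvectors

assert np.allclose(expm(A), R @ np.diag(np.exp(lam)) @ L)
```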
TODO for $A = R_A \cdot \Lambda_A \cdot L_A$, prove $log(A) = R_A \cdot log(\Lambda_A) \cdot L_A$
Let $B = log(A)$, so $exp(B) = A$
and $B = R_B \cdot \Lambda_B \cdot L_B$
and (since $exp(B) = R_B \cdot exp(\Lambda_B) \cdot L_B$, where the exponential of a diagonal matrix is diagonal, $(exp(\Lambda))_{ij} = \delta_{ij} exp(\Lambda_{ii})$, so $A$ and $B$ are diagonalized by the same eigenvectors) we know that $R_B = R_A, L_B = L_A$
so $\Lambda_B = log(\Lambda_A)$
so $log(A) = R_A \cdot log(\Lambda_A) \cdot L_A$
where, for a diagonal matrix, $(log(\Lambda_A))_{ij} = 0$ for $i \ne j$, or $log((\Lambda_A)_{ii})$ for $i = j$.
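Same check for the log — a sketch, with a symmetric positive-definite matrix assumed so the eigenvalues are positive and the principal log is unambiguous:

```python
# Sketch: log(A) = R · log(Λ) · L when A's eigenvalues are positive real.
import numpy as np
from scipy.linalg import logm

M = np.random.default_rng(1).normal(size=(4, 4))
A = M @ M.T + 4.0 * np.eye(4)        # symmetric positive definite
lam, R = np.linalg.eig(A)
L = np.linalg.inv(R)

assert np.allclose(logm(A), R @ np.diag(np.log(lam)) @ L)
```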
Flux
Jacobian Eigendecomposition:
$F^\alpha =$ flux.
${A^\alpha}_\mu = \frac{\partial F^\alpha}{\partial U^\mu} =$ flux Jacobian.
$A = R \cdot \Lambda \cdot L$
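Concrete (assumed) example of such a flux Jacobian, just to have numbers: the 1D isothermal Euler system, $U = (\rho, \rho v)$, $F = (\rho v, \rho v^2 + a^2 \rho)$ with sound speed $a$, whose Jacobian has eigenvalues $v \pm a$ — a sketch, not anything specific to the notes being followed here:

```python
# Sketch: flux Jacobian A = ∂F/∂U and its eigendecomposition A = R·Λ·L
# for 1D isothermal Euler (assumed illustrative system).
import numpy as np

a = 1.0                                  # isothermal sound speed

def flux_jacobian(U):
    rho, m = U                           # U = (ρ, m) with m = ρ v
    v = m / rho
    return np.array([[0.0,           1.0],
                     [a * a - v * v, 2.0 * v]])

U = np.array([1.2, 0.5])
A = flux_jacobian(U)
lam, R = np.linalg.eig(A)                # eigenvalues v - a, v + a
L = np.linalg.inv(R)

v = U[1] / U[0]
assert np.allclose(sorted(lam), sorted([v - a, v + a]))
assert np.allclose(R @ np.diag(lam) @ L, A)
```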
Let $-\int \Gamma_\mu v^\mu d\lambda = -\int \Gamma(\lambda, v) d\lambda = A$ ... notice the first $\Gamma$ parameter is its dependent variable $\lambda$, while the second is the vector $v^\mu$ that the one-form $\Gamma$ is acting upon. Maybe it would be more proper math to write $\Gamma(\lambda)(v)$?
So $A = R \cdot \Lambda \cdot L = -\int \Gamma_\mu v^\mu d\lambda$
And $exp(-\int \Gamma_\mu v^\mu d\lambda) = exp(A) = R \cdot exp(\Lambda) \cdot L$
Or, if instead we identify the propagator itself with the eigendecomposition, $exp(-\int \Gamma_\mu v^\mu d\lambda) = R \cdot \Lambda \cdot L$, then $-\int \Gamma_\mu v^\mu d\lambda = R \cdot log(\Lambda) \cdot L$
How about if we say $\nabla_{e_\mu} e_\nu = {\Gamma^\alpha}_{\mu\nu} e_\alpha$ is equivalent to $\frac{\partial F^\alpha}{\partial U^\mu}$?
How about if $-{\Gamma^\alpha}_{\mu\nu} \hat{x}^\mu = {A^\alpha}_\nu$?
$\int -{\Gamma^\alpha}_{\mu\nu} v^\mu d\lambda = \int {A^\alpha}_\nu d\lambda$
Then there's Dullemond, Wang's Hydrodynamics notes:
Start with $\partial_t U + \partial_{x^i} F^i = 0$
(In arbitrary curvilinear coordinates $\partial_t U + \nabla_i F^i = 0$)
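As a reminder of what that conservation-law form means numerically, a minimal finite-volume sketch — scalar advection with a local Lax-Friedrichs (Rusanov) interface flux, an assumed toy problem, not lifted from Dullemond & Wang:

```python
# Sketch: a minimal finite-volume discretization of ∂_t U + ∂_x F = 0
# (scalar advection F(U) = c U, periodic domain, local Lax-Friedrichs flux).
import numpy as np

c = 1.0
nx = 200
dx = 1.0 / nx
x = (np.arange(nx) + 0.5) * dx
U = np.exp(-100.0 * (x - 0.3) ** 2)        # initial cell averages
dt = 0.4 * dx / abs(c)                     # CFL-limited step

def interface_flux(UL, UR):
    # local Lax-Friedrichs (Rusanov): average flux minus dissipation
    return 0.5 * (c * UL + c * UR) - 0.5 * abs(c) * (UR - UL)

total0 = U.sum() * dx
for _ in range(100):
    Um, Up = np.roll(U, 1), np.roll(U, -1)       # periodic neighbors
    F_left = interface_flux(Um, U)               # flux at i - 1/2
    F_right = interface_flux(U, Up)              # flux at i + 1/2
    U = U - dt / dx * (F_right - F_left)

assert np.isclose(U.sum() * dx, total0)          # conservative by construction
```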
eqn. 7.36:
$F(U(x_R)) - F(U(x_L)) = \int_0^1 \frac{\partial}{\partial \lambda} F(U(\lambda)) d\lambda$
$= (\int_0^1 \frac{\partial F}{\partial U} d\lambda) \cdot (U(x_R) - U(x_L))$
...using the straight-line path $U(\lambda) = U(x_L) + \lambda (U(x_R) - U(x_L))$, so $\frac{\partial U}{\partial \lambda} = U(x_R) - U(x_L)$ is constant (checked numerically in the sketch after this derivation).
$= \int_0^1 (R \cdot \Lambda \cdot L) d\lambda \cdot (U(x_R) - U(x_L))$
...
$= \int_0^1 (R \cdot exp(log(\Lambda)) \cdot L) d\lambda \cdot (U(x_R) - U(x_L))$
...
$exp\left( F(U(x_R)) - F(U(x_L)) \right) = exp\left( \left( \int_0^1 \frac{\partial F}{\partial U} d\lambda \right) \cdot (U(x_R) - U(x_L)) \right)$ ... or does that go outside the exp?
$= exp\left( \int_0^1 \frac{\partial F}{\partial U} d\lambda \right) (U(x_R) - U(x_L))$
$= exp\left( \int_0^1 R \cdot \Lambda \cdot L d\lambda \right) (U(x_R) - U(x_L))$
$exp(F(U(x_R))) \cdot exp(-F(U(x_L))) = exp\left( \int_0^1 R \cdot \Lambda \cdot L d\lambda \right) (U(x_R) - U(x_L))$ ...assuming $F(U(x_R))$ and $F(U(x_L))$ commute, so that $exp(F_R - F_L) = exp(F_R) \cdot exp(-F_L)$.
so the flux $F$ is the log of the connection $\Gamma_x$ that is parallel-propagated along the x-coordinate.
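Numerical check of the eqn. 7.36 step above — a sketch using the same assumed isothermal Euler flux as before, a straight-line path in $U$-space, and midpoint quadrature for the path-averaged Jacobian:

```python
# Sketch: F(U_R) - F(U_L) = ( ∫_0^1 ∂F/∂U(U(λ)) dλ ) · (U_R - U_L)
# along U(λ) = U_L + λ (U_R - U_L), for the assumed isothermal Euler flux.
import numpy as np

a = 1.0

def flux(U):
    rho, m = U
    return np.array([m, m * m / rho + a * a * rho])

def flux_jacobian(U):
    rho, m = U
    v = m / rho
    return np.array([[0.0,           1.0],
                     [a * a - v * v, 2.0 * v]])

U_L = np.array([1.0, 0.2])
U_R = np.array([2.5, -0.7])
dU = U_R - U_L

n = 4000
lam = (np.arange(n) + 0.5) / n                       # midpoint quadrature nodes
A_bar = sum(flux_jacobian(U_L + l * dU) for l in lam) / n

assert np.allclose(flux(U_R) - flux(U_L), A_bar @ dU)
```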
Parallel propagator:
$e(x_R) = e(x_L) \cdot P(x_L, x_R)$
$= e(x_L) \cdot exp(-\int_0^1 \Gamma_v d\lambda)$
$e(x_L)^{-1} e(x_R) = exp(-\int_0^1 \Gamma_v d\lambda)$
$exp(log(e(x_L)^{-1} e(x_R))) = exp(-\int_0^1 \Gamma_v d\lambda)$
$exp(log(e(x_R)) - log(e(x_L))) = exp(-\int_0^1 \Gamma_v d\lambda)$ ...assuming $e(x_L)$ and $e(x_R)$ commute, so $log(e(x_L)^{-1} e(x_R)) = log(e(x_R)) - log(e(x_L))$.
$log(e(x_R)) - log(e(x_L)) = \int_0^1 -\Gamma_v d\lambda$
...substitute...
$F(U(x_R)) \leftrightarrow log(e(x_R))$
$F(U(x_L)) \leftrightarrow log(e(x_L))$
$\frac{\partial F}{\partial U} \cdot (U(x_R) - U(x_L)) \leftrightarrow -\Gamma_v$
...to get...
$F(U(x_R)) - F(U(x_L)) = \int_0^1 \frac{\partial F}{\partial U} (U(x_R) - U(x_L)) d\lambda$
$F(U(x_R)) - F(U(x_L)) = \int_0^1 \frac{\partial F}{\partial U} \frac{\partial U}{\partial \lambda} d\lambda$
$F(U(x_R)) - F(U(x_L)) = \int_0^1 \frac{\partial F}{\partial \lambda} d\lambda$
Then there is Misner, Thorne, Wheeler "Gravitation" exercise 16.6 showing how $\partial_t U + \nabla_{x^i} F^i = 0$ is the same as $\nabla_\nu T^{\mu\nu} = 0$
where $\partial_t \rho + \nabla_{x^i} (\rho v^i) = 0$ becomes $\nabla_\mu T^{t\mu} = 0$
and $\partial_t (\rho v^j) + \nabla_{x^i} (\rho v^i v^j + g^{ij} P) = 0$ becomes $\nabla_\mu T^{j\mu} = 0$