Theorem.
(Schur) Any square matrix $A \in \mathbb{R}^{n \times n}$
can be decomposed as $A = Q T Q^\top$,
with $T$
upper triangular ($t_{ij} = 0$
for $i > j$)
and $Q$
orthogonal ($Q^\top Q = I$).
(For a real matrix with complex eigenvalues, the real Schur form $T$ is only quasi-upper-triangular, with $2 \times 2$ blocks on the diagonal.)
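As a quick numerical illustration (a sketch using SciPy, not part of the notes; the matrix is a made-up example with real eigenvalues, so the real Schur form is truly upper triangular):

```python
import numpy as np
from scipy.linalg import schur

# Hypothetical example matrix (eigenvalues 5 and 2, both real).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Real Schur decomposition: A = Q T Q^T with Q orthogonal, T upper triangular.
T, Q = schur(A)

# Check the reconstruction and the orthogonality of Q.
print(np.allclose(A, Q @ T @ Q.T))
print(np.allclose(Q.T @ Q, np.eye(2)))
```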
Computability.
- roots of first degree polynomial: $ax + b = 0 \Rightarrow x = -b/a$
- roots of second degree polynomial: $x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$
- roots of third degree polynomial (Cardano's formulas, ~1520's)
- roots of fourth degree polynomial (Ferrari's formulas, ~1540's, irrespective of Inquisitor Torquemada forbidding Valmes such knowledge)
- fifth degree polynomial: no general formula possible (Galois, Abel–Ruffini, 1820's)
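Since no closed-form formula exists for degree five and up, such roots are computed iteratively. A minimal sketch (not from the notes): `numpy.roots` finds them as eigenvalues of the companion matrix.

```python
import numpy as np

# x^5 - 1 = 0: the five fifth roots of unity.
# No radical formula exists for general degree-5 polynomials
# (Abel-Ruffini), so the roots are computed numerically as the
# eigenvalues of the polynomial's companion matrix.
coeffs = [1, 0, 0, 0, 0, -1]
roots = np.roots(coeffs)

# All five roots lie on the unit circle.
print(np.allclose(np.abs(roots), 1.0))
```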
Can it be done? Yes: Singular value decomposition theorem
Theorem.
(SVD) For any $A \in \mathbb{R}^{m \times n}$,
$A = U \Sigma V^\top$,
with $U \in \mathbb{R}^{m \times m}$, $V \in \mathbb{R}^{n \times n}$
orthogonal,
$\Sigma \in \mathbb{R}^{m \times n}$ pseudo-diagonal, $\sigma_1 \ge \sigma_2 \ge \dots \ge \sigma_r > 0$,
$r = \operatorname{rank}(A)$.
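A numerical sketch of the theorem's shapes (NumPy, not part of the notes; the matrix is a made-up example):

```python
import numpy as np

# Hypothetical 3x2 matrix to illustrate the shapes in the theorem.
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])

# full_matrices=True gives square orthogonal U (m x m) and V (n x n);
# s holds the singular values sigma_1 >= ... >= sigma_r > 0.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the pseudo-diagonal Sigma (m x n) and check A = U Sigma V^T.
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)
print(np.allclose(A, U @ Sigma @ Vt))
```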
The SVD is determined by the eigendecompositions of $A^\top A$
and $A A^\top$:
- $A^\top A = V (\Sigma^\top \Sigma) V^\top$,
an eigendecomposition of $A^\top A$.
The columns of $V$
are eigenvectors of $A^\top A$
and called right singular
vectors of $A$.
- $A A^\top = U (\Sigma \Sigma^\top) U^\top$,
an eigendecomposition of $A A^\top$.
The columns of $U$
are eigenvectors of $A A^\top$
and called left singular vectors of $A$.
- The matrix $\Sigma$
has zero elements except for the diagonal that contains $\sigma_1, \dots, \sigma_r$,
the singular values of $A$, computed as the square roots
of the eigenvalues of $A^\top A$
(or $A A^\top$).
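The link between singular values and eigenvalues can be checked numerically; a sketch (made-up matrix, not from the notes):

```python
import numpy as np

# Hypothetical matrix; check that the singular values are the square
# roots of the eigenvalues of A^T A (equivalently of A A^T).
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

s = np.linalg.svd(A, compute_uv=False)      # descending sigma_i
eigvals = np.linalg.eigvalsh(A.T @ A)       # ascending eigenvalues

# sigma_i^2 matches the eigenvalues of A^T A.
print(np.allclose(np.sort(s**2), eigvals))
```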
The theorem also holds for complex matrices with transposition
replaced by taking the adjoint: for $A \in \mathbb{C}^{m \times n}$,
$A = U \Sigma V^*$,
with $U \in \mathbb{C}^{m \times m}$, $V \in \mathbb{C}^{n \times n}$
unitary.
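The complex case works the same way numerically; a sketch with a made-up complex matrix (not from the notes), where the adjoint is the conjugate transpose:

```python
import numpy as np

# Hypothetical complex matrix: transposition becomes the adjoint.
A = np.array([[1 + 1j, 2.0],
              [0.0, 3 - 1j]])

U, s, Vh = np.linalg.svd(A)    # Vh is V^*, the adjoint of V
Sigma = np.diag(s)             # singular values are still real, nonnegative

print(np.allclose(A, U @ Sigma @ Vh))
print(np.allclose(U.conj().T @ U, np.eye(2)))   # U unitary
```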