MATH347 L28: Course review
What is linear algebra and why is it so important to so many applications?
Basic operations: defined to express linear combination
Linear operators, Fundamental Theorem of Linear Algebra (FTLA)
Factorizations: more convenient expression of linear combination
Solving linear systems (change of basis)
Best 2-norm approximation: least squares
Exposing the structure of a linear operator from a set to itself through the eigendecomposition
Exposing the structure of a linear operator between different sets through the SVD
Applications: any type of correlation
What is linear algebra, and why is it important?
Science acquires and organizes knowledge into theories that can be verified by quantified tests. Mathematics furnishes the appropriate context through rigorous definitions.
Most areas of science require groups of numbers to describe an observation. To organize knowledge, rules on how such groups of numbers may be combined are needed. Mathematics furnishes the concept of a vector space:
formal definition of a single number: scalar,
formal definition of a group of numbers: vector,
formal definition of a possible way to combine vectors: the linear combination.
Algebra is concerned with the precise definition of ways to combine mathematical objects, i.e., with organizing more complex knowledge as a sequence of operations on simpler objects
Linear algebra concentrates on one particular operation: the linear combination
It turns out that a complete theory can be built around the linear combination, and this leads to the many applications linear algebra finds in all branches of knowledge.
Basic operations, concepts
Group vectors as column vectors into matrices, $A = [\, a_1 \; a_2 \; \cdots \; a_n \,] \in \mathbb{R}^{m \times n}$
Define matrix-vector multiplication to express the basic linear combination operation, $b = A x = x_1 a_1 + x_2 a_2 + \cdots + x_n a_n$
Introduce a way to switch between column and row storage through the transposition operation, $(A^T)_{ij} = a_{ji}$
Transform between one set of basis vectors and another
Linear independence establishes when a vector cannot be described as a linear combination of other vectors: if the only way to satisfy $x_1 a_1 + \cdots + x_n a_n = 0$ is $x_1 = \cdots = x_n = 0$, then the vectors $a_1, \dots, a_n$ are linearly independent
The span is the set of all vectors reachable by linear combination of $a_1, \dots, a_n$: $\operatorname{span}\{a_1, \dots, a_n\} = \{ b \mid b = x_1 a_1 + \cdots + x_n a_n \}$
The set of vectors $\{a_1, \dots, a_n\}$ is a basis of a vector space $V$ if $\operatorname{span}\{a_1, \dots, a_n\} = V$ and $a_1, \dots, a_n$ are linearly independent
The number of vectors in a basis is the dimension of a vector space.
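A minimal Matlab/Octave sketch of these operations (the matrix $A$ and coefficients $x$ are illustrative): the product $Ax$ is exactly a linear combination of the columns of $A$, and linear independence can be checked through the rank.

```matlab
% Matrix-vector product as a linear combination of the columns of A
A = [1 0; 1 1; 0 2];              % columns a1, a2 in R^3
x = [2; -1];
b = A*x;                          % b = 2*a1 - a2
disp(norm(b - (x(1)*A(:,1) + x(2)*A(:,2))))  % 0: identical vectors
disp(rank(A))                     % 2 = number of columns: a1, a2 independent
```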
Characterization of a linear operator
Any linear operator $f : \mathbb{R}^n \rightarrow \mathbb{R}^m$ can be characterized by a matrix $A \in \mathbb{R}^{m \times n}$, $f(x) = A x$
For each matrix $A$ there exist four fundamental subspaces:
Column space, $C(A) = \{ A x \mid x \in \mathbb{R}^n \} \subseteq \mathbb{R}^m$, the part of $\mathbb{R}^m$ reachable by linear combination of columns of $A$
Left null space, $N(A^T) = \{ y \mid A^T y = 0 \} \subseteq \mathbb{R}^m$, the part of $\mathbb{R}^m$ not reachable by linear combination of columns of $A$
Row space, $C(A^T) = \{ A^T y \mid y \in \mathbb{R}^m \} \subseteq \mathbb{R}^n$, the part of $\mathbb{R}^n$ reachable by linear combination of rows of $A$
Null space, $N(A) = \{ x \mid A x = 0 \} \subseteq \mathbb{R}^n$, the part of $\mathbb{R}^n$ not reachable by linear combination of rows of $A$
The fundamental theorem of linear algebra (FTLA) states: $\mathbb{R}^m = C(A) \oplus N(A^T)$ with $C(A) \perp N(A^T)$, and $\mathbb{R}^n = C(A^T) \oplus N(A)$ with $C(A^T) \perp N(A)$
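A quick numerical check of the FTLA, sketched in Matlab/Octave with a small illustrative rank-deficient matrix; null and orth return orthonormal bases for the corresponding subspaces.

```matlab
% Verify C(A^T) is orthogonal to N(A) and that the dimensions sum to n
A = [1 2 3; 2 4 6; 1 1 1];        % rank 2, so N(A) is nontrivial
Z = null(A);                      % orthonormal basis for the null space N(A)
R = orth(A');                     % orthonormal basis for the row space C(A^T)
disp(norm(R'*Z))                  % ~0: row space orthogonal to null space
disp(size(R,2) + size(Z,2))       % 3 = n: dimensions of C(A^T) and N(A) add up
```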
FTLA
(figure: diagram of the four fundamental subspaces and the orthogonal decompositions of $\mathbb{R}^m$ and $\mathbb{R}^n$ stated above)
Factorizations
$A = LU$ (or $PA = LU$ with $P$ a permutation matrix): Gaussian elimination, solving linear systems. Given $A \in \mathbb{R}^{m \times m}$, $b \in \mathbb{R}^m$, find $x$ such that $Ax = b$ by:
Factorize, $A = LU$
Solve the lower triangular system $Ly = b$ by forward substitution
Solve the upper triangular system $Ux = y$ by backward substitution
$A = QR$ (or $AP = QR$ with $P$ a permutation matrix): Gram-Schmidt, solving the least squares problem. Given $A \in \mathbb{R}^{m \times n}$, $b \in \mathbb{R}^m$, $m \geq n$, solve $\min_x \lVert b - A x \rVert_2$ by:
Factorize, $A = QR$
Solve the upper triangular system $R x = Q^T b$ by backward substitution
$A = X \Lambda X^{-1}$, eigendecomposition of $A \in \mathbb{C}^{m \times m}$ ($X$ invertible if $A$ is normal, i.e., $A A^H = A^H A$)
$A = Q \Lambda Q^H$, orthogonal (unitary) eigendecomposition when $A$ is normal; for real symmetric $A$, $Q$ is orthogonal and $\Lambda$ real
$A = Q T Q^H$, Schur decomposition of $A \in \mathbb{C}^{m \times m}$, $Q$ a unitary matrix, $T$ an upper triangular matrix; the decomposition always exists
$A = U \Sigma V^T$, singular value decomposition of $A \in \mathbb{R}^{m \times n}$, $U \in \mathbb{R}^{m \times m}$, $V \in \mathbb{R}^{n \times n}$ orthogonal matrices, $\Sigma \in \mathbb{R}^{m \times n}$ diagonal with $\sigma_1 \geq \sigma_2 \geq \cdots \geq 0$; the decomposition always exists
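All of the above are available as built-in routines; a brief Matlab/Octave sketch calling each one on an arbitrary, purely illustrative random matrix:

```matlab
A = randn(4);                     % generic square matrix, illustrative only
[L,U,P]  = lu(A);                 % P*A = L*U
[Q,R]    = qr(A);                 % A = Q*R
[X,D]    = eig(A);                % A*X = X*D, eigendecomposition if X invertible
[Qs,T]   = schur(A);              % A = Qs*T*Qs', T (quasi-)triangular; always exists
[U2,S,V] = svd(A);                % A = U2*S*V'; always exists
disp(norm(A - U2*S*V'))           % ~0
```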
Solving $Ax = b$ by Gaussian elimination
Gaussian elimination produces a sequence of matrices $A = A_1 \rightarrow A_2 \rightarrow \cdots \rightarrow A_m = U$
Step $k$ produces zeros underneath the diagonal position $(k,k)$
Step $k$ can be represented as multiplication by a matrix $L_k$: $A_{k+1} = L_k A_k$
All steps produce an upper triangular matrix, $U = L_{m-1} \cdots L_1 A$, hence $A = L U$ with $L = L_1^{-1} \cdots L_{m-1}^{-1}$
With permutations: Matlab [L,U,P]=lu(A) gives $PA = LU$, i.e., A=P'*L*U
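A sketch of the elimination loop in Matlab/Octave, assuming no pivoting is needed (the matrix is illustrative, chosen to have nonzero pivots):

```matlab
A = [2 1 1; 4 3 3; 8 7 9]; m = 3;
L = eye(m); U = A;
for k = 1:m-1                     % step k zeros entries below diagonal (k,k)
  for i = k+1:m
    L(i,k) = U(i,k)/U(k,k);       % multiplier, stored to build L
    U(i,:) = U(i,:) - L(i,k)*U(k,:);
  end
end
disp(norm(A - L*U))               % ~0: A = L*U
```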
Matrix formulation of Gaussian elimination
With a known $LU$-factorization: $A x = b \Leftrightarrow L (U x) = b$
To solve $A x = b$:
Carry out the $LU$-factorization, $A = LU$
Solve $L y = b$ by forward substitution to find $y$
Solve $U x = y$ by backward substitution to find $x$
FLOP = floating point operation = one multiplication and one addition
Operation counts: how many FLOPS in each step?
Elimination step $k$ costs $\approx (m-k)^2$ FLOPS; overall $\sum_{k=1}^{m-1} (m-k)^2 \approx m^3/3$ FLOPS
Forward substitution step $i$ costs $\approx i$ FLOPS; overall $\approx m^2/2$ FLOPS
Backward substitution cost is identical, $\approx m^2/2$ FLOPS
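A Matlab/Octave sketch of the two triangular solves; the factors L, U (as produced by the elimination example above) and right-hand side b are illustrative.

```matlab
m = 3; b = [1; 2; 3];
L = [1 0 0; 2 1 0; 4 3 1]; U = [2 1 1; 0 1 1; 0 0 2];
y = zeros(m,1);                   % forward substitution L*y = b, ~m^2/2 FLOPS
for i = 1:m
  y(i) = (b(i) - L(i,1:i-1)*y(1:i-1)) / L(i,i);
end
x = zeros(m,1);                   % backward substitution U*x = y, ~m^2/2 FLOPS
for i = m:-1:1
  x(i) = (y(i) - U(i,i+1:m)*x(i+1:m)) / U(i,i);
end
disp(norm(L*U*x - b))             % ~0
```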
Gram-Schmidt as $A = QR$
Orthonormalization of the columns of $A$ is also a factorization, $A = Q R$, with $Q^T Q = I$ and $R$ upper triangular
Operation count:
Each $r_{ij} = q_i^T a_j$ costs $m$ FLOPS
There are $\approx n^2/2$ components in $R$; overall cost $\approx m n^2$ FLOPS
With permutations: Matlab [Q,R,P]=qr(A) gives $AP = QR$
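A sketch of classical Gram-Schmidt in Matlab/Octave (A illustrative); the inner statement computes one entry $r_{ij}$ at the stated cost of $m$ FLOPS:

```matlab
A = [1 1; 1 0; 0 1]; [m,n] = size(A);
Q = zeros(m,n); R = zeros(n,n);
for j = 1:n
  v = A(:,j);
  for i = 1:j-1
    R(i,j) = Q(:,i)'*A(:,j);      % dot product: m FLOPS
    v = v - R(i,j)*Q(:,i);        % remove component along q_i
  end
  R(j,j) = norm(v);
  Q(:,j) = v/R(j,j);
end
disp(norm(A - Q*R))               % ~0: A = Q*R
```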
Solving linear systems by $QR$
With a known $QR$-factorization: $A x = b \Leftrightarrow Q R x = b \Rightarrow R x = Q^T b$
To solve $A x = b$:
Carry out the $QR$-factorization, $A = Q R$
Compute $c = Q^T b$
Solve $R x = c$ by backward substitution to find $x$
Operation counts: how many FLOPS in each step?
$QR$-factorization: $\approx m n^2$ FLOPS
Compute $c = Q^T b$: $\approx m n$ FLOPS
Backward substitution: $\approx n^2/2$ FLOPS
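The same steps in Matlab/Octave, using the built-in economy-size QR (A and b are illustrative):

```matlab
A = [1 1; 1 2; 1 3]; b = [1; 2; 2];
[Q,R] = qr(A,0);                  % economy QR: Q is m-by-n, R is n-by-n
c = Q'*b;                         % ~m*n FLOPS
x = R\c;                          % backward substitution, ~n^2/2 FLOPS
disp(norm(A'*(b - A*x)))          % ~0: residual orthogonal to C(A)
```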
Least squares
Consider approximating $b \in \mathbb{R}^m$ by a linear combination of $n$ vectors, $b \approx A x = x_1 a_1 + \cdots + x_n a_n$
Make the approximation error $e = b - A x$ as small as possible
Measuring the error in the 2-norm gives the least squares problem (LSQ): $\min_x \lVert b - A x \rVert_2$
The solution is the projection of $b$ onto $C(A)$: with $A = Q R$, $A x = Q Q^T b$
The vector $x$ is found by back-substitution from $R x = Q^T b$
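A small worked example, sketched in Matlab/Octave with made-up data: fitting a line by least squares, with $Ax$ equal to the projection of the data onto $C(A)$.

```matlab
t = (0:4)'; y = [1.1; 1.9; 3.2; 3.9; 5.1];   % illustrative measurements
A = [ones(size(t)) t];            % model y ~ x1 + x2*t
[Q,R] = qr(A,0);
x = R\(Q'*y);                     % LSQ solution by back-substitution
disp(x')                          % intercept and slope
disp(norm(A*x - Q*(Q'*y)))        % ~0: A*x is the projection of y onto C(A)
```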
The eigenvalue problem
For a square matrix $A \in \mathbb{C}^{m \times m}$, find the non-zero vectors whose directions are not changed by multiplication by $A$: $A x = \lambda x$, with $\lambda$ a scalar; this is the eigenvalue problem.
Consider the eigenproblem $A x = \lambda x$ for $A \in \mathbb{C}^{m \times m}$. Rewrite as $(A - \lambda I) x = 0$
Since $x \neq 0$, a solution to the eigenproblem exists only if $A - \lambda I$ is singular.
$A - \lambda I$ singular implies $\det(A - \lambda I) = 0$
Investigate the form of $\det(A - \lambda I)$:
$p(\lambda) = \det(A - \lambda I)$, an $m$-degree polynomial in $\lambda$, the characteristic polynomial of $A$, with $m$ roots $\lambda_1, \dots, \lambda_m$, the eigenvalues of $A$
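A quick Matlab/Octave illustration (A illustrative) that the roots of the characteristic polynomial are the eigenvalues; poly(A) returns the coefficients of $\det(\lambda I - A)$:

```matlab
A = [2 1; 1 2];
p = poly(A);                      % [1 -4 3], i.e. lambda^2 - 4*lambda + 3
disp(roots(p)')                   % 3, 1
disp(eig(A)')                     % same eigenvalues (possibly another order)
```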
Eigenvalue problem in matrix form
$A x_j = \lambda_j x_j$, the eigenvalue problem ($j = 1, \dots, m$) in matrix form: $A X = X \Lambda$
$X = [\, x_1 \; \cdots \; x_m \,]$ is the eigenvector matrix, $\Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_m)$ is the (diagonal) eigenvalue matrix
If the column vectors of $X$ are linearly independent, then $X$ is invertible
$A = X \Lambda X^{-1}$, the eigendecomposition of $A$ (compare to $A = LU$, $A = QR$)
The rule “determinant of product = product of determinants” implies $\det(A) = \det(X) \det(\Lambda) \det(X^{-1}) = \det(\Lambda) = \lambda_1 \lambda_2 \cdots \lambda_m$
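A Matlab/Octave check of the matrix form and the determinant rule, on the same illustrative A:

```matlab
A = [2 1; 1 2];
[X,Lam] = eig(A);                 % eigenvector matrix X, diagonal Lambda
disp(norm(A*X - X*Lam))           % ~0: A*X = X*Lambda
disp(norm(A - X*Lam/X))           % ~0: A = X*Lambda*inv(X)
disp([det(A) prod(diag(Lam))])    % equal: det(A) = product of eigenvalues
```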
Algebraic, geometric multiplicity
Definition. The algebraic multiplicity of an eigenvalue $\lambda$ is the number of times it appears as a root of the characteristic polynomial $p(\lambda) = \det(A - \lambda I)$.
Example. A characteristic polynomial with two single roots $\lambda_1, \lambda_2$ and a repeated root $\lambda_3 = \lambda_4$: the eigenvalue $\lambda_3$ has an algebraic multiplicity of 2
Definition. The geometric multiplicity of an eigenvalue $\lambda$ is the number of linearly independent eigenvectors associated with it, i.e., $\dim N(A - \lambda I)$.
Definition. A matrix $A \in \mathbb{C}^{m \times m}$ is diagonalizable if it has an eigendecomposition $A = X \Lambda X^{-1}$.
Theorem. A matrix is diagonalizable if the geometric multiplicity of each eigenvalue is equal to the algebraic multiplicity of that eigenvalue.
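The standard illustration of the gap between the two multiplicities, sketched in Matlab/Octave: a Jordan block has algebraic multiplicity 2 but geometric multiplicity 1, hence is not diagonalizable.

```matlab
A = [2 1; 0 2];                   % eigenvalue 2, algebraic multiplicity 2
disp(eig(A)')                     % 2, 2
disp(size(null(A - 2*eye(2)),2))  % 1 = geometric multiplicity < 2:
                                  % A is not diagonalizable
```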
SVD
The SVD $A = U \Sigma V^T$ of $A \in \mathbb{R}^{m \times n}$ reveals orthonormal bases for all four fundamental subspaces: with $r = \operatorname{rank}(A)$, the first $r$ columns of $U$ span $C(A)$, the last $m - r$ columns of $U$ span $N(A^T)$, the first $r$ columns of $V$ span $C(A^T)$, and the last $n - r$ columns of $V$ span $N(A)$
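A Matlab/Octave sketch extracting the four bases (the rank-deficient matrix is illustrative):

```matlab
A = [1 2 3; 2 4 6; 1 1 1];
[U,S,V] = svd(A); r = rank(A);    % r = 2
CA  = U(:,1:r);                   % orthonormal basis for C(A)
NAt = U(:,r+1:end);               % basis for N(A'), the left null space
CAt = V(:,1:r);                   % basis for C(A'), the row space
NA  = V(:,r+1:end);               % basis for N(A)
disp(norm(A*NA))                  % ~0: null-space vectors map to zero
```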
Computing the SVD
From $A = U \Sigma V^T$ obtain $A A^T = U \Sigma \Sigma^T U^T$ and $A^T A = V \Sigma^T \Sigma V^T$
Left singular vectors $u_i$ are eigenvectors of $A A^T$
Right singular vectors $v_i$ are eigenvectors of $A^T A$
The SVD is relevant for any type of correlation: gather repeated measurements as the columns of a data matrix $X$
Choose the origin such that the data has zero mean, then construct the covariance matrix $C = X X^T$
Eigenvectors of $C$ are singular vectors of $X$; applications: image compression, graph partitioning, social networks, data analysis
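A sketch of this correlation computation in Matlab/Octave, on randomly generated illustrative data:

```matlab
N = 200; X = [1 0.9; 0.9 1]*randn(2,N);     % correlated 2-by-N data set
X = X - mean(X,2)*ones(1,N);                % shift origin: zero-mean rows
C = X*X';                                   % covariance matrix
[U,S,V] = svd(X);                           % left singular vectors of X
[W,D]   = eig(C);                           % eigenvectors of C = X*X'
disp(abs(U'*W))                             % entries ~0 or 1: same directions
                                            % up to order and sign
```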