MATH661 HW05 - Linear algebra analytical practice

Posted: 10/04/23

Due: 10/11/23, 11:59PM

While computational aspects are exercised in P01, this homework concentrates on analytical properties.

1 Track 1

  1. Let λ,μ be distinct eigenvalues of 𝑨 symmetric, i.e., λ≠μ, 𝑨=𝑨T, 𝑨𝒙=λ𝒙, 𝑨𝒚=μ𝒚. Show that 𝒙,𝒚 are orthogonal.

    Solution. Compute (𝑨𝒙)T=𝒙T𝑨T=𝒙T𝑨=λ𝒙T, hence 𝒙T𝑨𝒚=λ𝒙T𝒚. Multiply 𝑨𝒚=μ𝒚 with 𝒙T to obtain 𝒙T𝑨𝒚=μ𝒙T𝒚. Subtracting gives

    0=(λ-μ)𝒙T𝒚 ⇒ 𝒙T𝒚=0

    since λ≠μ.
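    The result can be checked numerically; a minimal numpy sketch, using an arbitrary symmetric matrix (not from the problem statement):

```python
import numpy as np

# Arbitrary symmetric test matrix (illustrative choice)
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
lam, X = np.linalg.eigh(A)        # eigh exploits symmetry
G = X.T @ X                       # Gram matrix of the eigenvectors
print(np.allclose(G, np.eye(3)))  # eigenvectors are orthonormal: True
```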

  2. Consider

    𝑨=[ -i -i 0; -i i 0; 0 0 1 ].
    1. Is 𝑨 normal?

    2. Is 𝑨 self-adjoint?

    3. Is 𝑨 unitary?

    4. Find the eigenvalues and eigenvectors of 𝑨.

    Solution.
    1. Observe block structure

      𝑨=[ 𝑩 𝟎; 𝟎 1 ],

      such that 𝑨 is normal if 𝑩 is normal. Note that

      𝑩∗=[ i i; i -i ]=-𝑩,

      such that 𝑩𝑩∗-𝑩∗𝑩=-𝑩2+𝑩2=𝟎, hence 𝑩 is normal.

      Note: Always look for any special properties of a matrix before attempting calculations.

    2. No, since 𝑩∗=-𝑩.

    3. No, since 𝑩𝑩∗=-𝑩2=2𝑰≠𝑰.

    4. Block structure of 𝑨 gives λ3=1, 𝒙3=𝒆3, with the remaining eigenpairs obtained from those of 𝑩

      𝑩𝒒=μ𝒒 ⇒ μ=±i√2, 𝑸=[ 1 -1; √2-1 √2+1 ],
      𝑿=[ 1 -1 0; √2-1 √2+1 0; 0 0 1 ], 𝚲=diag(-i√2,i√2,1).
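    The stated properties can be verified numerically; a numpy sketch:

```python
import numpy as np

A = np.array([[-1j, -1j, 0],
              [-1j,  1j, 0],
              [  0,   0, 1]])
Ah = A.conj().T                                  # adjoint
print(np.allclose(A @ Ah, Ah @ A))               # normal: True
print(np.allclose(A, Ah))                        # self-adjoint: False
print(np.allclose(A @ Ah, np.eye(3)))            # unitary: False
lam = np.linalg.eigvals(A)
print(np.sort(np.abs(lam)))                      # moduli 1, √2, √2
```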
  3. Find the eigenvalues and eigenvectors of the matrix 𝑹∈ℝ3×3 expressing rotation around the z-axis (unit vector 𝒆3=[ 0 0 1 ]T).

    Solution. The rotation map 𝒇:ℝ3→ℝ3 is linear

    𝒇(α𝒖+β𝒗)=α𝒇(𝒖)+β𝒇(𝒗)

    and the matrix 𝑹 representing this rotation has columns given by the image of a basis set

    𝑹=[ 𝒇(𝒆1) 𝒇(𝒆2) 𝒇(𝒆3) ]=[ cosθ𝒆1+sinθ𝒆2 -sinθ𝒆1+cosθ𝒆2 𝒆3 ]=𝑰[ cosθ -sinθ 0; sinθ cosθ 0; 0 0 1 ].

    One eigenpair is

    λ3=1,𝒙3=𝒆3,

    since 𝑹𝒆3=𝒆3; a rotation does not change a vector along the axis of rotation. The characteristic polynomial p3(λ)=(λ-1)(λ2-2λcosθ+1) also has roots λ1=e-iθ, λ2=eiθ with associated eigenvectors

    𝒙1=[ 1 i 0 ]T=𝒆1+i𝒆2, 𝒙2=[ i 1 0 ]T=i𝒆1+𝒆2.

    The above can be verified in an insightful manner by recalling that rotation does not change vector lengths (isometric mapping) hence 𝑹 is orthogonal, and 𝑹-1=𝑹T, such that the eigenvalue relation 𝑹𝒙=λ𝒙 leads to

    𝒙=λ𝑹-1𝒙=λ𝑹T𝒙.

    Verify that 𝒙1 is an eigenvector by computing,

    λ1𝑹T𝒙1=e-iθ[ cosθ𝒆1T+sinθ𝒆2T; -sinθ𝒆1T+cosθ𝒆2T; 𝒆3T ](𝒆1+i𝒆2)=e-iθ[ cosθ+isinθ; -sinθ+icosθ; 0 ]=[ 1; i; 0 ]=𝒙1.

    A similar verification can be done for 𝒙2. Note how thinking of 𝑹 as a collection of column vectors leads to a concise, elegant solution of the eigenvalue problem. Also, notice that for rotation within the x1x2 plane the eigenvectors are outside the real x1x2 plane and that all eigenvalues are of unit absolute value since the rotation is isometric.
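    A numerical confirmation of the eigenstructure; a numpy sketch (the angle θ=0.7 is an arbitrary example):

```python
import numpy as np

theta = 0.7                       # arbitrary sample angle
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s, 0],
              [s,  c, 0],
              [0,  0, 1]])
lam = np.linalg.eigvals(R)
print(np.sort(np.angle(lam)))     # arguments -θ, 0, θ
x1 = np.array([1, 1j, 0])         # e1 + i e2
print(np.allclose(R @ x1, np.exp(-1j*theta) * x1))  # eigenpair (e^{-iθ}, x1): True
```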

  4. Find the eigenvalues and eigenvectors of the matrix 𝑺∈ℝ3×3 expressing rotation around the axis with unit vector 𝒍=(1/√3)[ 1 1 1 ]T.

    Solution. This is the same as the above problem, but in a different basis, one in which the axis of rotation is [ 1 1 1 ]T instead of [ 0 0 1 ]T. As before, one eigenpair is λ3=1, with 𝒙3=𝒍 along the rotation axis. Above, the other two eigenvectors were found using unit vectors perpendicular to the rotation axis. One can apply Gram-Schmidt to find 𝒋,𝒌 orthonormal to 𝒍 or simply observe that

    [ √2𝒋 √6𝒌 √3𝒍 ]=[ 1 1 1; -1 1 1; 0 -2 1 ]

    defines an orthogonal matrix 𝑩=[ 𝒋 𝒌 𝒍 ]. Rotating some vector 𝒙 around the 𝒍 axis is straightforward in the 𝑩 basis, so set 𝒙=𝑰𝒙=𝑩𝒖 ⇒ 𝒖=𝑩T𝒙. Read this to state: the vector 𝒙 has coordinates 𝒙 in the 𝑰 basis, but coordinates 𝒖 in the 𝑩 basis. In the 𝑩 basis the images of the basis vectors under the rotation are

    [ 𝒇(𝒋) 𝒇(𝒌) 𝒇(𝒍) ]=[ cosθ𝒋+sinθ𝒌 -sinθ𝒋+cosθ𝒌 𝒍 ]=𝑩[ cosθ -sinθ 0; sinθ cosθ 0; 0 0 1 ]=𝑩𝑹.

    The result of the rotation is

    𝒚=𝑩𝑹𝒖=𝑩𝑹𝑩T𝒙=𝑺𝒙,

    where 𝑺=𝑩𝑹𝑩T is the rotation matrix. The eigendecomposition of 𝑹 is known from the above problem, 𝑹=𝑸𝚲𝑸∗, furnishing the eigendecomposition of 𝑺

    𝑺=𝑩𝑹𝑩T=𝑩𝑸𝚲𝑸∗𝑩T=𝑼𝚲𝑼∗.

    The eigenvalues of 𝑺 are the same as those of 𝑹 (e±iθ,1) and the eigenvectors are the columns of

    𝑼=𝑩𝑸.
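    The construction 𝑺=𝑩𝑹𝑩T can be checked numerically; a numpy sketch (the angle θ=0.5 is an arbitrary example):

```python
import numpy as np

theta = 0.5                                     # arbitrary sample angle
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
B = np.array([[1.0,  1, 1],
              [-1.0, 1, 1],
              [0.0, -2, 1]]) / np.sqrt([2, 6, 3])   # orthonormal columns j, k, l
S = B @ R @ B.T
l = np.ones(3) / np.sqrt(3)
print(np.allclose(S @ l, l))                    # axis vector is fixed: True
lam = np.linalg.eigvals(S)
print(np.allclose(np.sort(np.angle(lam)), [-theta, 0, theta]))  # e^{±iθ}, 1: True
```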

  5. Compute sin(𝑨t) for

    𝑨=[ 3 -9; 2 -6 ].

    Solution. The matrix is singular hence one eigenvalue is λ1=0 with eigenvector

    𝒙1=[ 3 1 ]T.

    The other eigenpair is λ2=-3,

    𝒙2=[ 3 2 ]T,

    and 𝑨 with distinct eigenvalues is diagonalizable, 𝑨𝑿=𝑿𝚲 ⇒ 𝑨=𝑿𝚲𝑿-1. Powers of 𝑨 are given by

    𝑨k=𝑿𝚲k𝑿-1.

    From the Euler formula eiθ=cosθ+isinθ find

    sinθ=eiθ-e-iθ2i,

    which extended to matrix arguments gives

    sin(𝑨t)=12i[exp(it𝑨)-exp(-it𝑨)].

    From the power series et=1+t+t2/2!+⋯ obtain

    exp(it𝑨)=𝑰+it𝑨+(it𝑨)2/2!+⋯=𝑿exp(it𝚲)𝑿-1,

    leading to

    sin(𝑨t)=𝑿sin(t𝚲)𝑿-1=[ 3 3; 1 2 ][ 0 0; 0 -sin(3t) ][ 3 3; 1 2 ]-1=[ 1 -3; 2/3 -2 ]sin(3t).
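    The eigendecomposition route can be sanity-checked against a truncated power series for sin; a numpy sketch (the value t=0.3 is an arbitrary example):

```python
import numpy as np

t = 0.3                                          # arbitrary sample time
A = np.array([[3.0, -9.0], [2.0, -6.0]])
X = np.array([[3.0, 3.0], [1.0, 2.0]])
print(np.allclose(X @ np.diag([0.0, -3.0]) @ np.linalg.inv(X), A))  # A = XΛX⁻¹: True
sinAt = X @ np.diag([0.0, -np.sin(3*t)]) @ np.linalg.inv(X)
# Truncated series sin(M) = M - M³/3! + M⁵/5! - ...
M, term, series = t*A, t*A, np.zeros((2, 2))
for k in range(1, 20, 2):
    series = series + term
    term = -term @ M @ M / ((k+1) * (k+2))
print(np.allclose(sinAt, series))                # True
```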

  6. Compute cos(𝑨t) for

    𝑨=[ 5 -4; 2 -1 ].

    Solution. As above, from eigendecomposition

    𝑨=𝑿𝚲𝑿-1=[ 2 1; 1 1 ][ 3 0; 0 1 ][ 1 -1; -1 2 ],

    obtain

    cos(𝑨t)=𝑿cos(𝚲t)𝑿-1=[ 2 1; 1 1 ][ cos(3t) 0; 0 cos(t) ][ 1 -1; -1 2 ].
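    As for the previous problem, a numpy check against a truncated cosine series (the value t=0.4 is an arbitrary example):

```python
import numpy as np

t = 0.4                                          # arbitrary sample time
X = np.array([[2.0, 1.0], [1.0, 1.0]])
Xi = np.array([[1.0, -1.0], [-1.0, 2.0]])        # X^{-1}
A = X @ np.diag([3.0, 1.0]) @ Xi
print(np.allclose(A, [[5, -4], [2, -1]]))        # reproduces the matrix: True
cosAt = X @ np.diag([np.cos(3*t), np.cos(t)]) @ Xi
# Truncated series cos(M) = I - M²/2! + M⁴/4! - ...
M2, term, series = (t*A) @ (t*A), np.eye(2), np.zeros((2, 2))
for k in range(0, 20, 2):
    series = series + term
    term = -term @ M2 / ((k+1) * (k+2))
print(np.allclose(cosAt, series))                # True
```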

  7. Compute the SVD of

    𝑨=[ 1 -2; -3 6 ]

    by finding the eigenvalues and eigenvectors of 𝑨𝑨T, 𝑨T𝑨.

    Solution. With the SVD 𝑨=𝑼𝚺𝑽T, the matrix 𝑴=𝑨𝑨T=𝑼𝚺2𝑼T has eigendecomposition

    𝑴=(1/√10)[ -1 3; 3 1 ][ 50 0; 0 0 ](1/√10)[ -1 3; 3 1 ],

    hence

    𝑼=(1/√10)[ -1 3; 3 1 ], 𝚺=[ √50 0; 0 0 ].

    From 𝑵=𝑨T𝑨=𝑽𝚺2𝑽T with eigendecomposition

    𝑵=(1/√5)[ -1 2; 2 1 ][ 50 0; 0 0 ](1/√5)[ -1 2; 2 1 ],

    obtain

    𝑽=(1/√5)[ -1 2; 2 1 ].
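    The factors can be verified to reproduce 𝑨 and to match numpy's singular values; a numpy sketch:

```python
import numpy as np

A = np.array([[1.0, -2.0], [-3.0, 6.0]])
U = np.array([[-1.0, 3.0], [3.0, 1.0]]) / np.sqrt(10)
S = np.diag([np.sqrt(50), 0.0])
V = np.array([[-1.0, 2.0], [2.0, 1.0]]) / np.sqrt(5)
print(np.allclose(U @ S @ V.T, A))               # A = UΣVᵀ: True
sigma = np.linalg.svd(A, compute_uv=False)
print(np.allclose(sigma, [np.sqrt(50), 0]))      # singular values √50, 0: True
```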

  8. Find the eigenvalues and eigenvectors of 𝑨∈ℝm×m with elements aij=1 for all 1≤i,j≤m. Hint: start with m=1,2,3 and generalize.

    Solution. Note that rank(𝑨)=1, hence 𝑨 has an (m-1)-fold repeated zero eigenvalue, λ2=⋯=λm=0, implying

    p𝑨(λ)=det(λ𝑰-𝑨)=| λ-1 -1 ⋯ -1; -1 λ-1 ⋯ -1; ⋮ ⋮ ⋱ ⋮; -1 -1 ⋯ λ-1 |=λm-1(λ-λ1).

    Adding all rows gives a row with all entries λ-m, such that λ1=m leads to a null determinant, with associated eigenvector 𝒙1=[ 1 1 ⋯ 1 ]T. The other eigenvectors are of the form 𝒙j=[ 1 0 ⋯ -1 ⋯ 0 ]T with the -1 in the jth position, j=2,…,m.
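    A numerical confirmation for a sample size; a numpy sketch (m=5 chosen arbitrarily):

```python
import numpy as np

m = 5                                            # arbitrary sample size
A = np.ones((m, m))
lam = np.sort(np.linalg.eigvalsh(A))             # symmetric eigensolver
print(np.allclose(lam, [0, 0, 0, 0, m]))         # m-1 zeros and one m: True
x1 = np.ones(m)
print(np.allclose(A @ x1, m * x1))               # eigenpair (m, [1,...,1]ᵀ): True
xj = np.array([1.0, 0, -1, 0, 0])                # 1 and -1 pattern, zero eigenvalue
print(np.allclose(A @ xj, 0 * xj))               # True
```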

2 Track 2

  1. Prove that 𝑨∈ℂm×m is normal if and only if it has m orthonormal eigenvectors.

    Solution. Apply the Schur theorem, 𝑨=𝑼𝑻𝑼∗, such that 𝑨 normal, 𝑨𝑨∗=𝑨∗𝑨, implies 𝑻𝑻∗=𝑻∗𝑻, which implies that 𝑻 is diagonal, as proven by induction:

    𝑻=[ t 𝒛∗; 𝟎 𝑺 ], 𝑻𝑻∗=[ t 𝒛∗; 𝟎 𝑺 ][ t̄ 𝟎; 𝒛 𝑺∗ ]=[ |t|2+𝒛∗𝒛 𝒛∗𝑺∗; 𝑺𝒛 𝑺𝑺∗ ],

    𝑻∗𝑻=[ t̄ 𝟎; 𝒛 𝑺∗ ][ t 𝒛∗; 𝟎 𝑺 ]=[ |t|2 t̄𝒛∗; t𝒛 𝒛𝒛∗+𝑺∗𝑺 ],

    since |t|2+𝒛∗𝒛=|t|2 implies 𝒛=𝟎; induction on the smaller block 𝑺 then gives 𝑻=𝚲 diagonal, so 𝑨=𝑼𝚲𝑼∗ and the columns of 𝑼 are m orthonormal eigenvectors. Conversely, if 𝑨=𝑼𝚲𝑼∗ with 𝑼 unitary, then 𝑨𝑨∗=𝑼𝚲𝚲∗𝑼∗=𝑼𝚲∗𝚲𝑼∗=𝑨∗𝑨, hence 𝑨 is normal.

  2. Prove that 𝑨∈ℝm×m symmetric has a repeated eigenvalue if and only if it commutes with a non-zero skew-symmetric matrix 𝑩.

    Solution. 𝑨 symmetric is normal and has an orthogonal eigendecomposition 𝑨=𝑸𝚲𝑸T, that states that in the 𝑸 basis the effect of 𝑨 is simple scaling of components by the eigenvalues. From 𝑩 that commutes with 𝑨, construct 𝑪=𝑸T𝑩𝑸=-𝑪T (to work in 𝑸 basis), and find

    𝚲𝑪-𝑪𝚲=𝑸T(𝑨𝑩-𝑩𝑨)𝑸=𝟎 ⇒ 𝚲𝑪=𝑪𝚲.

    A repeated eigenvalue is a statement about the components of 𝚲. Equality of the (i,j) components of 𝚲𝑪 and 𝑪𝚲 implies

    λicij=cijλj.

    From the above, if 𝑪 has at least one non-zero element cij, necessarily off-diagonal since 𝑪 is skew-symmetric, then λi=λj, a repeated eigenvalue. For the converse choose cij=1=-cji at some i≠j for which λi=λj, and set 𝑩=𝑸𝑪𝑸T≠𝟎, which then commutes with 𝑨.
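    A numerical illustration of the converse construction; a numpy sketch (the basis and eigenvalues below are arbitrary choices, not from the problem statement):

```python
import numpy as np

# Orthogonal basis from QR of an arbitrary nonsingular matrix
Q, _ = np.linalg.qr(np.array([[1.0, 2, 0], [0, 1, 3], [2, 0, 1]]))
A = Q @ np.diag([2.0, 2.0, 5.0]) @ Q.T          # symmetric, eigenvalue 2 repeated
C = np.zeros((3, 3))
C[0, 1], C[1, 0] = 1.0, -1.0                    # c12 = 1 = -c21 at the repeated pair
B = Q @ C @ Q.T                                 # non-zero skew-symmetric
print(np.allclose(B, -B.T))                     # skew-symmetric: True
print(np.allclose(A @ B, B @ A))                # commutes with A: True
```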

  3. Prove that every positive definite matrix 𝑲∈ℝm×m has a unique square root 𝑩, 𝑩 positive definite and 𝑩2=𝑲.

    Solution. 𝑲 positive definite implies 𝒒T𝑲𝒒>0 for all 𝒒≠𝟎 and 𝑲=𝑲T. 𝑲 symmetric admits an orthogonal eigendecomposition 𝑲=𝑸𝚲𝑸T with eigenvalues λj>0. Define 𝑩=𝑸𝚲1/2𝑸T, where the diagonal elements of 𝚲1/2 are √λj, such that 𝑩 is positive definite and 𝑩2=𝑸𝚲𝑸T=𝑲. Uniqueness follows since any positive definite 𝑩 with 𝑩2=𝑲 commutes with 𝑲, hence is diagonalizable in the same basis, and its positive eigenvalues must be the unique positive roots √λj.
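    The construction 𝑩=𝑸𝚲1/2𝑸T can be checked numerically; a numpy sketch (𝑲 below is an arbitrary positive definite example):

```python
import numpy as np

# Arbitrary positive definite example (symmetric, diagonally dominant)
K = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
lam, Q = np.linalg.eigh(K)
print((lam > 0).all())                           # positive definite: True
B = Q @ np.diag(np.sqrt(lam)) @ Q.T              # B = Q Λ^{1/2} Qᵀ
print(np.allclose(B @ B, K))                     # B² = K: True
print(np.allclose(B, B.T) and (np.linalg.eigvalsh(B) > 0).all())  # B pos. def.: True
```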

  4. Find all positive definite orthogonal matrices.

    Solution. In addition to the above properties, 𝑲𝑲T=𝑸𝚲2𝑸T=𝑰 ⇒ 𝚲2=𝑰, so possible elements of 𝚲 are ±1. Imposing 𝒆jT𝑸T𝑲𝑸𝒆j=λj>0 then leads to 𝚲=𝑰, hence 𝑲=𝑰.

  5. Find the eigenvalues and eigenvectors of a Householder reflection matrix.

    Solution. Write

    𝑯=𝑰-2𝒒𝒒T∈ℝm×m

    with 𝒒 the unit vector normal to the reflection hyperplane, and note that 𝑯 symmetric has an orthogonal eigendecomposition 𝑯=𝑼𝚲𝑼T. Since 𝑯 is isometric, |λ|=1, and symmetry gives real eigenvalues, hence λ=±1. From

    𝑯𝒒=𝒒-2𝒒(𝒒T𝒒)=-𝒒

    find the eigenpair (-1,𝒒). For any 𝒗⊥𝒒 obtain

    𝑯𝒗=𝒗

    so (1,𝒗) are the remaining m-1 eigenpairs.
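    The spectrum can be confirmed numerically; a numpy sketch (the unit vector 𝒒 below is an arbitrary example):

```python
import numpy as np

q = np.array([1.0, 2.0, 2.0]) / 3.0              # arbitrary unit normal vector
H = np.eye(3) - 2.0 * np.outer(q, q)
print(np.allclose(np.linalg.eigvalsh(H), [-1, 1, 1]))  # one -1, rest 1: True
print(np.allclose(H @ q, -q))                    # eigenpair (-1, q): True
```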

  6. Find the eigenvalues and eigenvectors of a Givens rotation matrix.

    Solution. The rotation matrix

    𝑹(j,k,θ)=𝑰+(cosθ-1)(𝒆j𝒆jT+𝒆k𝒆kT)-sinθ(𝒆j𝒆kT-𝒆k𝒆jT)

    has m-2 eigenpairs (1,𝒆l) for l≠j, l≠k, since those basis vectors are untouched by the rotation. The other eigenpairs are

    λj=eiθ, 𝒙j=i𝒆j+𝒆k; λk=e-iθ, 𝒙k=𝒆j+i𝒆k.

    Verify

    𝑹𝒙j=i𝒆j+𝒆k+(cosθ-1)(𝒆j𝒆jT+𝒆k𝒆kT)(i𝒆j+𝒆k)-sinθ(𝒆j𝒆kT-𝒆k𝒆jT)(i𝒆j+𝒆k)

    𝑹𝒙j=i𝒆j+𝒆k+(cosθ-1)(i𝒆j+𝒆k)-sinθ(𝒆j-i𝒆k)=(icosθ-sinθ)𝒆j+(cosθ+isinθ)𝒆k=eiθ𝒙j.
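    A direct numerical check of the Givens eigenpairs; a numpy sketch (the size, plane, and angle are arbitrary sample parameters):

```python
import numpy as np

m, j, k, theta = 4, 1, 2, 0.6                    # arbitrary sample parameters
c, s = np.cos(theta), np.sin(theta)
R = np.eye(m)
R[j, j] = R[k, k] = c
R[j, k], R[k, j] = -s, s                         # sign convention of the formula above
lam = np.linalg.eigvals(R)
print(np.allclose(np.sort(np.angle(lam)), [-theta, 0, 0, theta]))  # e^{±iθ}, 1, 1: True
x = np.zeros(m, dtype=complex)
x[j], x[k] = 1j, 1.0                             # x_j = i e_j + e_k
print(np.allclose(R @ x, np.exp(1j*theta) * x))  # eigenvalue e^{iθ}: True
```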

  7. Prove or state a counterexample: If all eigenvalues of 𝑨 are zero then 𝑨=𝟎.

    Solution. Counterexample

    𝑨=[ 0 1; 0 0 ],

    a Jordan block with 0 diagonal.

  8. Prove: A hermitian matrix is unitarily diagonalizable and its eigenvalues are real.

    Solution. Apply the Schur theorem: 𝑨=𝑸𝑻𝑸∗ and 𝑨=𝑨∗=(𝑸𝑻𝑸∗)∗ ⇒ 𝑸(𝑻-𝑻∗)𝑸∗=𝟎 ⇒ 𝑻=𝑻∗. Since 𝑻 triangular is equal to its adjoint, it must be diagonal with real elements, 𝑻=𝚲, hence 𝑨=𝑸𝚲𝑸∗.