Singular Value Decomposition/Factorization (SVD) - Reduced SVD

Singular Value Decomposition (SVD)


if 𝐴π‘₯ = πœ†π‘₯ & ||π‘₯|| = 1, then:

  • ||𝐴π‘₯|| = ||πœ†π‘₯||
  • ||𝐴π‘₯|| = |πœ†| ||π‘₯||
  • ||𝐴π‘₯|| = |πœ†|

SVD - Definition (Real and/or Complex Matrix)

The SVD of an π‘šβœ•π‘› real and/or complex matrix 𝐴 with rank π‘Ÿ is:

  • 𝐴 = π‘ˆπ›΄π‘‰*

where:

  • π‘ˆ - π‘šβœ•π‘š real or complex unitary matrix
  • 𝛴 - π‘šβœ•𝑛 rectangular matrix with diagonal entries of the first π‘Ÿ singular values of 𝐴 (𝜎1  πœŽ2 ≥ ... ≥ πœŽπ‘Ÿ)
  • 𝑉 - π‘›βœ•π‘› real or complex unitary matrix (𝑉* - complex conjugate of π‘‰)

SVD - Definition (Real Matrix)

If A is real, U and V can be chosen to be real orthogonal matrices and V* = Vᵀ (where Vᵀ is the transpose of V):

  • A = UΣVᵀ

where:

  • π‘ˆ - orthogonal π‘šβœ•π‘š unitary matrix. normalized eigenvectors of the symmetric matrix π΄π΄π‘‡ (𝑖.𝑒. π‘ˆπ‘‡π‘ˆ=𝐼) columns of π‘ˆ are called the left singular vectors of 𝐴
  • 𝑉 - orthogonal π‘›βœ•π‘› unitary matrix. normalized eigenvectors of the symmetric matrix π΄π‘‡π΄ (𝑖.𝑒. π‘‰π‘‡π‘‰=𝐼) columns of π‘‰ are called the right singular vectors of 𝐴
  • 𝛴 - diagonal matrix of singular values
  • 𝛴 = 𝐷(1/2) where 𝐷 is the diagonal matrix of the eigenvalues of matrix 𝐴𝐴𝑇 and π΄π‘‡π΄
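
A minimal sketch (assuming numpy; the 2✕3 matrix is just an example) checking that the singular values are the square roots of the eigenvalues of AᵀA and that U and V are orthogonal:

```python
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0,  7.0, -2.0]])

U, s, Vt = np.linalg.svd(A)                       # s holds the singular values, descending

# eigenvalues of AᵀA (the non-zero ones are shared with AAᵀ), sorted descending
eig_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]

print(np.allclose(s, np.sqrt(eig_AtA[:len(s)])))  # σi = sqrt(λi)
print(np.allclose(U.T @ U, np.eye(2)))            # UᵀU = I  (U orthogonal)
print(np.allclose(Vt @ Vt.T, np.eye(3)))          # VᵀV = I  (V orthogonal)
```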

derivation:

  • let {v1, ..., vn} be an orthonormal basis of ℝⁿ consisting of eigenvectors of AᵀA, ordered by decreasing eigenvalue; then {Av1, ..., Avr} is an orthogonal basis for the column space of A
  • normalize each Avi to obtain an orthonormal basis {u1, ..., ur} for the column space of A
    • ui = Avi / ||Avi|| = Avi / σi
    • Avi = σi ui  for 1 ≤ i ≤ r

  • now extend {u1, ..., ur} to an orthonormal basis {u1, ..., um} of ℝᵐ
  • with U = [u1 ... um] and V = [v1 ... vn], the relations Avi = σi ui give AV = UΣ, hence A = UΣVᵀ (the sketch below carries out these steps numerically)
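
A sketch of the derivation, assuming numpy (the 2✕3 matrix is an arbitrary example): build ui = Avi / σi from orthonormal eigenvectors vi of AᵀA and check that Avi = σi ui and that the ui are orthonormal.

```python
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0,  7.0, -2.0]])

# orthonormal eigenvectors of AᵀA, reordered by decreasing eigenvalue
eigvals, V = np.linalg.eigh(A.T @ A)
order = np.argsort(eigvals)[::-1]
eigvals, V = eigvals[order], V[:, order]

r = int(np.sum(eigvals > 1e-10))                          # rank of A
sigma = np.sqrt(eigvals[:r])                              # σi = ||A vi||
U_r = (A @ V[:, :r]) / sigma                              # columns ui = A vi / σi

print(np.allclose(A @ V[:, :r], U_r * sigma))             # A vi = σi ui for 1 ≤ i ≤ r
print(np.allclose(U_r.T @ U_r, np.eye(r)))                # the ui are orthonormal
print(np.allclose(A, U_r @ np.diag(sigma) @ V[:, :r].T))  # reassembles A (reduced SVD)
```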


SVD - 4 Fundamental Subspaces

The Invertible Matrix Theorem

let 𝐴 be π‘›βœ•π‘› matrix. then the following statements are equivalent to the statement that 𝐴 is an invertible matrix:

  • (πΆπ‘œπ‘™π΄)βŸ‚ = {0}
  • (N𝑒𝑙𝑙𝐴)βŸ‚ = ℝ𝑛
  • π‘…π‘œπ‘€π΄ = ℝ𝑛
  • 𝐴 has 𝑛 non-zero singular values
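
A quick illustration of the last statement (a sketch assuming numpy; the two 2✕2 matrices are arbitrary examples):

```python
import numpy as np

invertible = np.array([[2.0, 1.0],
                       [0.0, 3.0]])          # det = 6, invertible
singular   = np.array([[1.0, 2.0],
                       [2.0, 4.0]])          # second row = 2 ✕ first row, rank 1

for name, M in [("invertible", invertible), ("singular", singular)]:
    s = np.linalg.svd(M, compute_uv=False)   # singular values only
    print(name, "-> non-zero singular values:", int(np.sum(s > 1e-12)))
```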

SVD - Reduced SVD & Pseudo Inverse (Moore-Penrose Inverse)

π‘ˆπ‘Ÿπ·π‘‰π‘Ÿπ‘‡ is called reduced SVD

pseudoinverse (Moore-Penrose inverse) of A (numpy sketch below):

  • A† = Vr D⁻¹ Urᵀ

application to least-squares solution

given the equation 𝐴π‘₯ = 𝑏, use pseudoinverse of 𝐴

  • π‘₯Μ‚ = 𝐴𝑏
  • π‘₯Μ‚ = π‘‰π‘Ÿπ·-1π‘ˆπ‘Ÿπ‘‡π‘

then from SVD:

  • 𝐴π‘₯Μ‚ = π‘ˆπ‘Ÿπ·π‘‰π‘Ÿπ‘‡π‘‰π‘Ÿπ·-1π‘ˆπ‘Ÿπ‘‡π‘
  • 𝐴π‘₯Μ‚ = π‘ˆπ‘Ÿπ‘ˆπ‘Ÿπ‘‡π‘

SVD - Intuition (As Linear Transformations)

When A is a Square m✕m Real Matrix

  • A is a linear transformation of the space ℝᵐ
  • SVD breaks down any square A into 3 geometric transformations:
    • V* - rotation or reflection
    • Σ - scaling (by the singular values)
    • U - rotation or reflection
  • U & V* can be chosen to be real m✕m matrices; in that case unitary means orthogonal, so both U & V* can be thought of as rotations and/or reflections
  • if the determinant of A is (illustrated below):
    • positive - U & V* are either both reflections or both rotations
    • negative - exactly one of them is a reflection
    • zero - anything goes (no constraint on the signs)
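
A small illustration of the determinant rule, assuming numpy (the two 2✕2 matrices are arbitrary examples): det(U) and det(V*) are each ±1, they share the same sign when det(A) > 0, and exactly one is −1 when det(A) < 0.

```python
import numpy as np

examples = [np.array([[2.0, 1.0], [1.0, 3.0]]),   # det = 5 > 0
            np.array([[0.0, 1.0], [1.0, 0.0]])]   # det = -1 < 0 (a reflection)

for A in examples:
    U, s, Vt = np.linalg.svd(A)
    print("det(A) =", round(np.linalg.det(A), 3),
          " det(U) =", round(np.linalg.det(U), 3),    # +1 rotation, -1 reflection
          " det(V*) =", round(np.linalg.det(Vt), 3))
```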

When A is a Rectangular m✕n Real Matrix

  • A is a linear transformation from the space ℝⁿ to ℝᵐ
  • SVD breaks down any m✕n real matrix A into 3 geometric transformations (shape check below):
    • V* - rotation or reflection in the space ℝⁿ
    • Σ - scaling from ℝⁿ to ℝᵐ
    • U - rotation or reflection in the space ℝᵐ