Singular Value Decomposition

By the end of this post, you will be able to:

  • Understand the concept of Singular Value Decomposition (SVD).
  • Decompose a matrix into orthogonal matrices and singular values.
  • Learn how to compute SVD step by step.
  • Explore applications of SVD in data compression and machine learning.

Definition of Singular Value Decomposition

The **Singular Value Decomposition (SVD)** of an \( m \times n \) matrix \( A \) is a factorization of the form:

\[ A = U \Sigma V^T \]

where:

  • \( U \) is an \( m \times m \) orthogonal matrix (left singular vectors).
  • \( \Sigma \) is an \( m \times n \) rectangular diagonal matrix with the **singular values** \( \sigma_1 \ge \sigma_2 \ge \cdots \ge 0 \) on the diagonal.
  • \( V \) is an \( n \times n \) orthogonal matrix (right singular vectors).

Computing SVD Step by Step

To compute \( A = U \Sigma V^T \):

  1. Find the eigenvalues and eigenvectors of \( A^T A \) to get \( V \).
  2. Find the eigenvalues and eigenvectors of \( A A^T \) to get \( U \) (equivalently, set \( u_i = A v_i / \sigma_i \), which fixes the signs of the \( u_i \) consistently with \( V \)).
  3. Compute singular values as \( \sigma_i = \sqrt{\lambda_i} \), where \( \lambda_i \) are the eigenvalues of \( A^T A \).
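Assuming NumPy is available, the three steps above can be sketched as follows for a small full-rank matrix. Using the shortcut \( u_i = A v_i / \sigma_i \) for step 2 avoids having to match eigenvector signs between two separate eigen-decompositions:

```python
import numpy as np

# Illustrative sketch of the steps above (assumes a full-rank 2x2 A).
A = np.array([[4.0, 0.0],
              [3.0, -5.0]])

# Step 1: eigen-decomposition of A^T A gives V and the eigenvalues.
eigvals, V = np.linalg.eigh(A.T @ A)
order = np.argsort(eigvals)[::-1]        # sort eigenvalues descending
eigvals, V = eigvals[order], V[:, order]

# Step 3: singular values are the square roots of the eigenvalues.
sigma = np.sqrt(eigvals)

# Step 2 (shortcut): columns of U follow from u_i = A v_i / sigma_i.
U = A @ V / sigma

# Verify the factorization A = U Sigma V^T.
Sigma = np.diag(sigma)
print(np.allclose(A, U @ Sigma @ V.T))   # True
```

In practice one calls `np.linalg.svd(A)` directly; the sketch only mirrors the hand computation.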

Proof

Since \( A^T A \) is symmetric, it can be diagonalized:

\[ A^T A = V \Lambda V^T \]

where \( \Lambda \) contains the (nonnegative) eigenvalues of \( A^T A \) and the columns of \( V \) are orthonormal eigenvectors. Setting \( \sigma_i = \sqrt{\lambda_i} \) gives \( \Sigma \), and defining \( u_i = A v_i / \sigma_i \) for each nonzero \( \sigma_i \) produces orthonormal vectors satisfying \( A v_i = \sigma_i u_i \). Collecting these columns into \( U \) yields:

\[ A = U \Sigma V^T \]

Applications of SVD

SVD is widely used in:

  • Data Compression: Reducing dimensionality while preserving essential information.
  • Image Processing: Approximating images with lower-rank matrices.
  • Machine Learning: Reducing feature space in Principal Component Analysis (PCA).
  • Signal Processing: Noise filtering and signal reconstruction.
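The compression applications rest on truncating the SVD to its largest singular values. A minimal sketch, using a synthetic matrix rather than real image data:

```python
import numpy as np

# Sketch of rank-k approximation via truncated SVD (the idea behind
# SVD-based image compression); the matrix here is synthetic.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 6))

U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2  # keep only the two largest singular values
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# By the Eckart-Young theorem, A_k is the best rank-k approximation in
# the Frobenius norm; the error equals sqrt(sum of discarded s_i^2).
err = np.linalg.norm(A - A_k, 'fro')
print(np.isclose(err, np.sqrt(np.sum(s[k:]**2))))  # True
```

Storing \( A_k \) requires only \( k(m + n + 1) \) numbers instead of \( mn \), which is where the compression comes from.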

Examples

Example 1: Find the SVD of:

\[ A = \begin{bmatrix} 4 & 0 \\ 3 & -5 \end{bmatrix} \]

Step 1: Compute \( A^T A \):

\[ A^T A = \begin{bmatrix} 4 & 3 \\ 0 & -5 \end{bmatrix} \begin{bmatrix} 4 & 0 \\ 3 & -5 \end{bmatrix} = \begin{bmatrix} 25 & -15 \\ -15 & 25 \end{bmatrix} \]

Step 2: Compute eigenvalues of \( A^T A \):

\[ \det \begin{bmatrix} 25 - \lambda & -15 \\ -15 & 25 - \lambda \end{bmatrix} = (25 - \lambda)^2 - 225 = \lambda^2 - 50\lambda + 400 = 0 \]

which gives \( \lambda_1 = 40 \) and \( \lambda_2 = 10 \).

Step 3: Compute singular values:

\[ \sigma_1 = \sqrt{40}, \quad \sigma_2 = \sqrt{10} \]

Step 4: Compute eigenvectors for \( V \), then \( U \), and form the decomposition.
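The hand computation above can be checked numerically (assuming NumPy): the singular values of \( A \) should come out as \( \sqrt{40} \) and \( \sqrt{10} \).

```python
import numpy as np

# Numerical check of Example 1: the singular values of A
# should be sqrt(40) and sqrt(10), in descending order.
A = np.array([[4.0, 0.0],
              [3.0, -5.0]])
s = np.linalg.svd(A, compute_uv=False)
print(np.allclose(s, [np.sqrt(40), np.sqrt(10)]))  # True
```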

Exercises

  • Question 1: Compute the singular values for \( A = \begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix} \).
  • Question 2: Find the SVD of \( A = \begin{bmatrix} 3 & 0 \\ 0 & 4 \end{bmatrix} \).
  • Question 3: Explain why \( U \) and \( V \) are always orthogonal in SVD.

  • Answer 1: \( A^T A = \begin{bmatrix} 5 & 4 \\ 4 & 5 \end{bmatrix} \) has eigenvalues \( 9 \) and \( 1 \), so \( \sigma_1 = 3, \sigma_2 = 1 \).
  • Answer 2: \( U = I \), \( \Sigma = \begin{bmatrix} 3 & 0 \\ 0 & 4 \end{bmatrix} \), \( V = I \) is a valid decomposition; under the usual convention \( \sigma_1 \ge \sigma_2 \), swap the columns to obtain \( \Sigma = \begin{bmatrix} 4 & 0 \\ 0 & 3 \end{bmatrix} \) with correspondingly permuted \( U \) and \( V \).
  • Answer 3: Because \( A^T A \) and \( A A^T \) are symmetric, the spectral theorem guarantees their eigenvectors can be chosen to form orthonormal bases, so \( U \) and \( V \) are orthogonal.
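The numerical answers can be verified in a few lines (assuming NumPy); note that `np.linalg.svd` returns singular values in descending order:

```python
import numpy as np

# Question 1: singular values of [[1, 2], [2, 1]] are 3 and 1.
s1 = np.linalg.svd(np.array([[1.0, 2.0], [2.0, 1.0]]), compute_uv=False)
print(np.allclose(s1, [3.0, 1.0]))  # True

# Question 2: for diag(3, 4), descending order puts 4 first.
s2 = np.linalg.svd(np.array([[3.0, 0.0], [0.0, 4.0]]), compute_uv=False)
print(np.allclose(s2, [4.0, 3.0]))  # True
```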
