
Singular Value Decomposition

  • Understand the concept of Singular Value Decomposition (SVD).
  • Decompose a matrix into orthogonal matrices and singular values.
  • Learn how to compute SVD step by step.
  • Explore applications of SVD in data compression and machine learning.

Definition of Singular Value Decomposition

The **Singular Value Decomposition (SVD)** of an \( m \times n \) matrix \( A \) is a factorization of the form:

\[ A = U \Sigma V^T \]

where:

  • \( U \) is an \( m \times m \) orthogonal matrix (left singular vectors).
  • \( \Sigma \) is an \( m \times n \) diagonal matrix with **singular values** on the diagonal.
  • \( V \) is an \( n \times n \) orthogonal matrix (right singular vectors).
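The factorization and the orthogonality of \( U \) and \( V \) can be checked numerically; a minimal sketch using NumPy (the example matrix is arbitrary):

```python
import numpy as np

# A small 3x2 example matrix (chosen arbitrarily for illustration).
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# full_matrices=True returns U as m x m and Vt as n x n,
# matching the definition A = U Sigma V^T.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Build the m x n "diagonal" Sigma from the 1-D vector of singular values.
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

print(np.allclose(A, U @ Sigma @ Vt))    # A = U Sigma V^T
print(np.allclose(U.T @ U, np.eye(3)))   # U is orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(2))) # V is orthogonal
```

All three checks print `True`, confirming the reconstruction and the orthogonality of both factor matrices.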

Computing SVD Step by Step

To compute \( A = U \Sigma V^T \):

  1. Find the eigenvalues and eigenvectors of \( A^T A \); the orthonormal eigenvectors form the columns of \( V \).
  2. Compute the singular values as \( \sigma_i = \sqrt{\lambda_i} \), where \( \lambda_i \) are the eigenvalues of \( A^T A \) (these are nonnegative because \( A^T A \) is positive semidefinite).
  3. Obtain \( U \): either from the eigenvectors of \( A A^T \), or directly as \( \mathbf{u}_i = A \mathbf{v}_i / \sigma_i \) for each \( \sigma_i \neq 0 \), which keeps the signs of \( U \) consistent with \( V \).
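These steps can be sketched in NumPy and compared against the library routine. The sketch uses the sign-consistent variant \( \mathbf{u}_i = A \mathbf{v}_i / \sigma_i \) for \( U \), since eigen-decomposing \( A A^T \) separately can produce mismatched signs:

```python
import numpy as np

A = np.array([[4.0, 0.0],
              [3.0, -5.0]])

# Step 1: eigen-decompose the symmetric matrix A^T A; eigenvectors give V.
eigvals, V = np.linalg.eigh(A.T @ A)

# eigh returns eigenvalues in ascending order; reorder so that
# sigma_1 >= sigma_2 >= ..., the usual SVD convention.
order = np.argsort(eigvals)[::-1]
eigvals, V = eigvals[order], V[:, order]

# Step 2: singular values are the square roots of the eigenvalues.
s = np.sqrt(eigvals)

# Step 3: u_i = A v_i / sigma_i (valid here since A has full rank).
U = A @ V / s

Sigma = np.diag(s)
print(np.allclose(A, U @ Sigma @ V.T))  # True
```

The singular values computed this way agree with `np.linalg.svd(A, compute_uv=False)`.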

Proof

Since \( A^T A \) is symmetric, it can be diagonalized:

\[ A^T A = V \Lambda V^T \]

where \( \Lambda \) contains the (nonnegative) eigenvalues of \( A^T A \). Taking the square root of \( \Lambda \) gives \( \Sigma \). Defining \( \mathbf{u}_i = A \mathbf{v}_i / \sigma_i \) for each nonzero \( \sigma_i \) (and extending these vectors to an orthonormal basis of \( \mathbb{R}^m \) if needed) produces an orthogonal matrix \( U \) satisfying \( A V = U \Sigma \), which leads to:

\[ A = U \Sigma V^T \]

Applications of SVD

SVD is widely used in:

  • Data Compression: Reducing dimensionality while preserving essential information.
  • Image Processing: Approximating images with lower-rank matrices.
  • Machine Learning: Reducing feature space in Principal Component Analysis (PCA).
  • Signal Processing: Noise filtering and signal reconstruction.
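The compression idea can be illustrated with a truncated SVD: keeping only the \( k \) largest singular values gives the best rank-\( k \) approximation in the Frobenius norm (the Eckart–Young theorem). A minimal sketch, using a synthetic nearly rank-3 matrix as a stand-in for image data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for an "image": a 50x40 matrix that is nearly rank 3 plus noise.
M = rng.normal(size=(50, 3)) @ rng.normal(size=(3, 40)) \
    + 0.01 * rng.normal(size=(50, 40))

U, s, Vt = np.linalg.svd(M, full_matrices=False)

# Rank-k approximation: keep only the k largest singular values.
k = 3
M_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Storage drops from 50*40 = 2000 numbers to k*(50 + 40 + 1) = 273,
# while the relative reconstruction error stays small.
rel_err = np.linalg.norm(M - M_k) / np.linalg.norm(M)
print(rel_err < 0.05)  # True for this nearly rank-3 matrix
```

This is the same mechanism PCA exploits: directions with small singular values carry little variance and can be discarded.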

Examples

Example 1: Find the SVD of:

\[ A = \begin{bmatrix} 4 & 0 \\ 3 & -5 \end{bmatrix} \]

Step 1: Compute \( A^T A \):

\[ A^T A = \begin{bmatrix} 4 & 3 \\ 0 & -5 \end{bmatrix} \begin{bmatrix} 4 & 0 \\ 3 & -5 \end{bmatrix} = \begin{bmatrix} 25 & -15 \\ -15 & 25 \end{bmatrix} \]

Step 2: Compute eigenvalues of \( A^T A \):

\[ \det \begin{bmatrix} 25 - \lambda & -15 \\ -15 & 25 - \lambda \end{bmatrix} = 0 \] \[ (25 - \lambda)^2 - 225 = 0 \] \[ \lambda^2 - 50\lambda + 400 = 0 \] \[ \lambda = 40, 10 \]

Step 3: Compute singular values:

\[ \sigma_1 = \sqrt{40} = 2\sqrt{10} \approx 6.32, \quad \sigma_2 = \sqrt{10} \approx 3.16 \]

Step 4: Compute eigenvectors for \( V \), then \( U \). For \( \lambda_1 = 40 \), solving \( (A^T A - 40 I)\mathbf{v} = 0 \) gives \( \mathbf{v}_1 = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ -1 \end{bmatrix} \); for \( \lambda_2 = 10 \), \( \mathbf{v}_2 = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ 1 \end{bmatrix} \). Then \( \mathbf{u}_i = A \mathbf{v}_i / \sigma_i \) yields \( \mathbf{u}_1 = \frac{1}{\sqrt{5}} \begin{bmatrix} 1 \\ 2 \end{bmatrix} \) and \( \mathbf{u}_2 = \frac{1}{\sqrt{5}} \begin{bmatrix} 2 \\ -1 \end{bmatrix} \), so:

\[ A = \underbrace{\frac{1}{\sqrt{5}} \begin{bmatrix} 1 & 2 \\ 2 & -1 \end{bmatrix}}_{U} \underbrace{\begin{bmatrix} \sqrt{40} & 0 \\ 0 & \sqrt{10} \end{bmatrix}}_{\Sigma} \underbrace{\frac{1}{\sqrt{2}} \begin{bmatrix} 1 & -1 \\ 1 & 1 \end{bmatrix}}_{V^T} \]
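The singular values found in this example can be checked numerically; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

# The matrix from Example 1.
A = np.array([[4.0, 0.0],
              [3.0, -5.0]])

# np.linalg.svd returns singular values in descending order.
s = np.linalg.svd(A, compute_uv=False)
print(np.allclose(s, [np.sqrt(40), np.sqrt(10)]))  # True
```

As a sanity check, \( \sigma_1^2 + \sigma_2^2 = 50 \) equals \( \operatorname{tr}(A^T A) \), and \( \sigma_1 \sigma_2 = 20 = |\det A| \).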

Exercises

  • Question 1: Compute the singular values for \( A = \begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix} \).
  • Question 2: Find the SVD of \( A = \begin{bmatrix} 3 & 0 \\ 0 & 4 \end{bmatrix} \).
  • Question 3: Explain why \( U \) and \( V \) are always orthogonal in SVD.

  • Answer 1: \( \sigma_1 = 3, \sigma_2 = 1 \). Here \( A^T A = \begin{bmatrix} 5 & 4 \\ 4 & 5 \end{bmatrix} \) has eigenvalues 9 and 1, so the singular values are their square roots. (Equivalently, \( A \) is symmetric with eigenvalues 3 and \(-1\), and the singular values are their absolute values.)
  • Answer 2: With singular values in the conventional descending order, \( \Sigma = \begin{bmatrix} 4 & 0 \\ 0 & 3 \end{bmatrix} \) and \( U = V = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} \); if no ordering is imposed, \( U = V = I \) with \( \Sigma = A \) itself also works.
  • Answer 3: Because \( A^T A \) and \( A A^T \) are symmetric, the spectral theorem guarantees their eigenvectors can be chosen to form orthonormal bases, making \( U \) and \( V \) orthogonal.
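The numerical answers to Questions 1 and 2 can be verified with a short check; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

# Question 1: A = [[1, 2], [2, 1]] is symmetric with eigenvalues 3 and -1,
# so its singular values are their absolute values: 3 and 1.
s1 = np.linalg.svd(np.array([[1.0, 2.0], [2.0, 1.0]]), compute_uv=False)
print(np.allclose(s1, [3.0, 1.0]))  # True

# Question 2: for a diagonal matrix the singular values are the absolute
# diagonal entries, reported in descending order: 4 and 3.
s2 = np.linalg.svd(np.array([[3.0, 0.0], [0.0, 4.0]]), compute_uv=False)
print(np.allclose(s2, [4.0, 3.0]))  # True
```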
