- Understand the concept of Singular Value Decomposition (SVD).
- Decompose a matrix into orthogonal matrices and singular values.
- Learn how to compute SVD step by step.
- Explore applications of SVD in data compression and machine learning.
Definition of Singular Value Decomposition
The **Singular Value Decomposition (SVD)** of an \( m \times n \) matrix \( A \) is a factorization of the form:
\[ A = U \Sigma V^T \]
where:
- \( U \) is an \( m \times m \) orthogonal matrix (left singular vectors).
- \( \Sigma \) is an \( m \times n \) rectangular diagonal matrix with the **singular values** \( \sigma_1 \ge \sigma_2 \ge \dots \ge 0 \) on its diagonal.
- \( V \) is an \( n \times n \) orthogonal matrix (right singular vectors).
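These three properties can be checked directly with NumPy; the matrix below is an arbitrary example chosen for this sketch:

```python
import numpy as np

# An arbitrary 2x2 matrix used only to illustrate the factorization.
A = np.array([[4.0, 0.0],
              [3.0, -5.0]])

# np.linalg.svd returns U, the singular values, and V^T directly.
U, s, Vt = np.linalg.svd(A)

# U and V are orthogonal: U^T U = I and V V^T = I.
print(np.allclose(U.T @ U, np.eye(2)))    # True
print(np.allclose(Vt @ Vt.T, np.eye(2)))  # True

# Reassembling U Sigma V^T recovers A.
Sigma = np.diag(s)
print(np.allclose(U @ Sigma @ Vt, A))     # True
```

Note that NumPy returns the singular values in decreasing order, matching the usual convention.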
Computing SVD Step by Step
To compute \( A = U \Sigma V^T \):
- Find the eigenvalues and eigenvectors of \( A^T A \) to get \( V \).
- Find the eigenvalues and eigenvectors of \( A A^T \) to get \( U \) (equivalently, set \( u_i = A v_i / \sigma_i \) for each nonzero \( \sigma_i \), which fixes the signs of the columns of \( U \) consistently with \( V \)).
- Compute singular values as \( \sigma_i = \sqrt{\lambda_i} \), where \( \lambda_i \) are the eigenvalues of \( A^T A \).
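The steps above can be sketched with NumPy's symmetric eigensolver (`np.linalg.eigh`, appropriate here because \( A^T A \) is symmetric); the example matrix is arbitrary:

```python
import numpy as np

A = np.array([[4.0, 0.0],
              [3.0, -5.0]])

# Step 1: eigen-decompose A^T A; its eigenvectors are the columns of V.
eigvals, V = np.linalg.eigh(A.T @ A)

# eigh returns eigenvalues in ascending order; reorder to the usual
# decreasing SVD convention.
order = np.argsort(eigvals)[::-1]
eigvals, V = eigvals[order], V[:, order]

# Step 3: singular values are square roots of the eigenvalues of A^T A.
s = np.sqrt(eigvals)

# Columns of U via u_i = A v_i / sigma_i (valid for nonzero sigma_i).
U = A @ V / s

print(np.allclose(U @ np.diag(s) @ V.T, A))  # True
print(np.allclose(s, [np.sqrt(40), np.sqrt(10)]))  # True
```

Building \( U \) from \( A v_i / \sigma_i \) rather than a separate eigendecomposition of \( A A^T \) avoids sign mismatches between the two factorizations.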
Proof
Since \( A^T A \) is symmetric, it can be diagonalized:
\[ A^T A = V \Lambda V^T \]
where \( \Lambda \) contains the (nonnegative) eigenvalues of \( A^T A \). Taking the square root of \( \Lambda \) gives \( \Sigma \), and setting \( u_i = A v_i / \sigma_i \) for each nonzero \( \sigma_i \) produces orthonormal columns for \( U \), leading to:
\[ A = U \Sigma V^T \]
Applications of SVD
SVD is widely used in:
- Data Compression: Reducing dimensionality while preserving essential information.
- Image Processing: Approximating images with lower-rank matrices.
- Machine Learning: Reducing feature space in Principal Component Analysis (PCA).
- Signal Processing: Noise filtering and signal reconstruction.
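The compression idea can be illustrated by truncating the SVD, which gives the best rank-\( k \) approximation in the Frobenius norm (the Eckart-Young theorem); the synthetic low-rank matrix below is a stand-in for real data such as an image:

```python
import numpy as np

rng = np.random.default_rng(0)
# A synthetic 50x50 matrix of rank at most 20, standing in for real data.
M = rng.standard_normal((50, 20)) @ rng.standard_normal((20, 50))

U, s, Vt = np.linalg.svd(M, full_matrices=False)

# Keep only the k largest singular values: storing U[:, :k], s[:k], and
# Vt[:k, :] needs far fewer numbers than storing M itself.
k = 10
M_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Relative reconstruction error of the rank-k approximation.
err = np.linalg.norm(M - M_k) / np.linalg.norm(M)
print(f"rank-{k} relative error: {err:.3f}")
```

In PCA the same truncation is applied to a centered data matrix, so the retained singular vectors span the directions of greatest variance.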
Examples
Example 1: Find the SVD of:
\[ A = \begin{bmatrix} 4 & 0 \\ 3 & -5 \end{bmatrix} \]
Step 1: Compute \( A^T A \):
\[ A^T A = \begin{bmatrix} 4 & 3 \\ 0 & -5 \end{bmatrix} \begin{bmatrix} 4 & 0 \\ 3 & -5 \end{bmatrix} = \begin{bmatrix} 25 & -15 \\ -15 & 25 \end{bmatrix} \]
Step 2: Compute eigenvalues of \( A^T A \):
\[ \det \begin{bmatrix} 25 - \lambda & -15 \\ -15 & 25 - \lambda \end{bmatrix} = 0 \]
\[ (25 - \lambda)^2 - 225 = 0 \]
\[ \lambda^2 - 50\lambda + 400 = 0 \]
\[ \lambda = 40, \quad \lambda = 10 \]
Step 3: Compute singular values:
\[ \sigma_1 = \sqrt{40}, \quad \sigma_2 = \sqrt{10} \]
Step 4: Compute eigenvectors for \( V \), then \( U \), and form the decomposition.
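The hand computation above can be sanity-checked numerically:

```python
import numpy as np

A = np.array([[4.0, 0.0],
              [3.0, -5.0]])

# compute_uv=False returns only the singular values, largest first.
s = np.linalg.svd(A, compute_uv=False)
print(np.allclose(s, [np.sqrt(40), np.sqrt(10)]))  # True
```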
Exercises
- Question 1: Compute the singular values for \( A = \begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix} \).
- Question 2: Find the SVD of \( A = \begin{bmatrix} 3 & 0 \\ 0 & 4 \end{bmatrix} \).
- Question 3: Explain why \( U \) and \( V \) are always orthogonal in SVD.
- Answer 1: \( A^T A = \begin{bmatrix} 5 & 4 \\ 4 & 5 \end{bmatrix} \) has eigenvalues \( 9 \) and \( 1 \), so \( \sigma_1 = 3, \sigma_2 = 1 \).
- Answer 2: With singular values in decreasing order, \( \Sigma = \begin{bmatrix} 4 & 0 \\ 0 & 3 \end{bmatrix} \) and \( U = V = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} \). (The factorization \( U = V = I \), \( \Sigma = A \) also satisfies \( A = U \Sigma V^T \) if the ordering convention is dropped.)
- Answer 3: Because \( A^T A \) and \( A A^T \) are symmetric, the spectral theorem guarantees that their eigenvectors can be chosen to form orthonormal bases, and these become the columns of \( V \) and \( U \).
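The singular values in Exercises 1 and 2 can be checked numerically:

```python
import numpy as np

# Exercise 1: singular values of [[1, 2], [2, 1]].
A1 = np.array([[1.0, 2.0],
               [2.0, 1.0]])
s1 = np.linalg.svd(A1, compute_uv=False)
print(np.allclose(s1, [3.0, 1.0]))  # True

# Exercise 2: for a diagonal matrix with positive entries, the singular
# values are the diagonal entries, reported in decreasing order.
A2 = np.diag([3.0, 4.0])
s2 = np.linalg.svd(A2, compute_uv=False)
print(np.allclose(s2, [4.0, 3.0]))  # True
```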