Inner Product Spaces

  • Understand the definition of inner product spaces.
  • Compute norms, angles, and distances in inner product spaces.
  • Explore the concept of orthogonality.
  • Learn the Gram-Schmidt orthogonalization process.
  • Use inner products for projections and least squares approximation.

Definition of Inner Product Spaces

An inner product space is a vector space equipped with an operation called the **inner product**, which assigns a scalar \( \langle \mathbf{u}, \mathbf{v} \rangle \) to each pair of vectors. It generalizes the dot product: it is linear in its first argument, (conjugate) symmetric, and positive definite, and it lets us measure lengths, angles, and similarity between vectors.

For the real vector space \( \mathbb{R}^n \), the standard inner product of two vectors \( \mathbf{u}, \mathbf{v} \) is:

\[ \langle \mathbf{u}, \mathbf{v} \rangle = u_1 v_1 + u_2 v_2 + \dots + u_n v_n \]

For the complex vector space \( \mathbb{C}^n \), the standard inner product is:

\[ \langle \mathbf{u}, \mathbf{v} \rangle = u_1 \overline{v_1} + u_2 \overline{v_2} + \dots + u_n \overline{v_n} \]
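The short sketch below evaluates both formulas numerically. It is a minimal illustration assuming NumPy is available; the vectors are made-up examples, not taken from the lesson.

```python
import numpy as np

# Real inner product: <u, v> = u1*v1 + ... + un*vn
u = np.array([1.0, 0.0, 2.0])
v = np.array([3.0, 1.0, -1.0])
print(np.dot(u, v))  # 1.0

# Complex inner product: <u, v> = u1*conj(v1) + ... + un*conj(vn)
uc = np.array([1 + 1j, 2 - 1j])
vc = np.array([3 - 2j, 1j])
print(np.sum(uc * np.conj(vc)))  # conjugate the second vector, then sum -> 3j here
```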

Norms, Angles, and Orthogonality

The **norm** (or length) of a vector is:

\[ \|\mathbf{v}\| = \sqrt{\langle \mathbf{v}, \mathbf{v} \rangle} \]

The **angle** \( \theta \) between two nonzero vectors \( \mathbf{u}, \mathbf{v} \) is given by:

\[ \cos\theta = \frac{\langle \mathbf{u}, \mathbf{v} \rangle}{\|\mathbf{u}\| \|\mathbf{v}\|} \]

Two vectors are **orthogonal** if:

\[ \langle \mathbf{u}, \mathbf{v} \rangle = 0 \]
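The following sketch, under the same NumPy assumption, computes a norm, an angle, and an orthogonality check directly from these formulas; the vectors are again arbitrary examples.

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, 1.0, 2.0])

# Norm: ||v|| = sqrt(<v, v>)
norm_u = np.sqrt(np.dot(u, u))   # same value as np.linalg.norm(u) -> 3.0
norm_v = np.linalg.norm(v)       # 3.0

# Angle: cos(theta) = <u, v> / (||u|| ||v||), for nonzero u and v
cos_theta = np.dot(u, v) / (norm_u * norm_v)
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
print(np.degrees(theta))         # about 27.3 degrees

# Orthogonality: <x, y> = 0 (up to floating-point rounding)
x = np.array([1.0, 2.0])
y = np.array([-2.0, 1.0])
print(np.isclose(np.dot(x, y), 0.0))  # True
```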

Gram-Schmidt Orthogonalization

The **Gram-Schmidt process** converts a set of linearly independent vectors into an orthonormal set that spans the same subspace.

  • Start with linearly independent vectors \( \mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n \).
  • Compute an orthonormal basis \( \mathbf{q}_1, \mathbf{q}_2, \dots, \mathbf{q}_n \) using the formulas below (a code sketch follows this list):

\[ \mathbf{q}_1 = \frac{\mathbf{v}_1}{\|\mathbf{v}_1\|} \]

\[ \mathbf{q}_2 = \frac{\mathbf{v}_2 - \langle \mathbf{q}_1, \mathbf{v}_2 \rangle \mathbf{q}_1}{\|\mathbf{v}_2 - \langle \mathbf{q}_1, \mathbf{v}_2 \rangle \mathbf{q}_1\|} \]
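A minimal NumPy sketch of the process is shown below. It subtracts the projections one at a time (the "modified" variant of Gram-Schmidt, which agrees with the formulas above in exact arithmetic and is numerically more stable); the function name `gram_schmidt` and the example vectors are illustrative choices, not part of the lesson.

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal list spanning the same space as `vectors`.

    `vectors` is a list of 1-D NumPy arrays assumed linearly independent.
    """
    basis = []
    for v in vectors:
        w = v.astype(float)
        # Subtract the projection onto each previously constructed q_i
        for q in basis:
            w = w - np.dot(q, w) * q
        basis.append(w / np.linalg.norm(w))
    return basis

# Example: orthonormalize two vectors in R^3
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, 0.0, 1.0])
q1, q2 = gram_schmidt([v1, v2])
print(np.dot(q1, q2))                            # approximately 0
print(np.linalg.norm(q1), np.linalg.norm(q2))    # both approximately 1
```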

Proof

The Gram-Schmidt process constructs orthonormal vectors by subtracting from each vector its components along the vectors already constructed.

Starting with \( \mathbf{q}_1 = \frac{\mathbf{v}_1}{\|\mathbf{v}_1\|} \), we first remove from \( \mathbf{v}_2 \) its component along \( \mathbf{q}_1 \):

\[ \mathbf{w}_2 = \mathbf{v}_2 - \langle \mathbf{q}_1, \mathbf{v}_2 \rangle \mathbf{q}_1 \]

Because \( \langle \mathbf{q}_1, \mathbf{q}_1 \rangle = 1 \), we have \( \langle \mathbf{q}_1, \mathbf{w}_2 \rangle = \langle \mathbf{q}_1, \mathbf{v}_2 \rangle - \langle \mathbf{q}_1, \mathbf{v}_2 \rangle = 0 \), so \( \mathbf{w}_2 \) is orthogonal to \( \mathbf{q}_1 \). Normalizing, \( \mathbf{q}_2 = \frac{\mathbf{w}_2}{\|\mathbf{w}_2\|} \), recovers the formula above, and repeating the argument for \( \mathbf{v}_3, \dots, \mathbf{v}_n \) yields the full orthonormal set.

Projections and Least Squares

The **projection** of a vector \( \mathbf{v} \) onto a nonzero vector \( \mathbf{u} \) is:

\[ \text{proj}_{\mathbf{u}} \mathbf{v} = \frac{\langle \mathbf{u}, \mathbf{v} \rangle}{\langle \mathbf{u}, \mathbf{u} \rangle} \mathbf{u} \]

The **least squares solution** to \( A\mathbf{x} = \mathbf{b} \) minimizes:

\[ \| A\mathbf{x} - \mathbf{b} \| \]

Projecting \( \mathbf{b} \) onto the column space of \( A \) leads to the **normal equations**:

\[ A^T A \mathbf{x} = A^T \mathbf{b} \]
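A small sketch, assuming NumPy and using made-up sample data, computes a projection and then solves the normal equations, cross-checking the result against NumPy's built-in least squares routine.

```python
import numpy as np

# Projection of v onto u: (<u, v> / <u, u>) u
u = np.array([1.0, 2.0])
v = np.array([3.0, 1.0])
proj = (np.dot(u, v) / np.dot(u, u)) * u
print(proj)

# Least squares: fit a line y = c0 + c1*t to a few sample points
# (illustrative data, not taken from the lesson)
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.0, 2.1, 2.9, 4.2])
A = np.column_stack([np.ones_like(t), t])

# Solve the normal equations A^T A x = A^T b
x = np.linalg.solve(A.T @ A, A.T @ b)
print(x)

# Same answer from NumPy's built-in least squares solver
x_check, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_check)
```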

Examples

Example 1: Compute the inner product of \( \mathbf{u} = (1,2,3) \) and \( \mathbf{v} = (4,-1,2) \).

\[ \langle \mathbf{u}, \mathbf{v} \rangle = (1)(4) + (2)(-1) + (3)(2) = 4 - 2 + 6 = 8 \]
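For a quick numerical check of this arithmetic (again assuming NumPy):

```python
import numpy as np

u = np.array([1, 2, 3])
v = np.array([4, -1, 2])
print(np.dot(u, v))  # 8, matching the hand computation
```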

Exercises

  • Question 1: Find the norm of \( \mathbf{v} = (3,4) \).
  • Question 2: Compute the inner product of \( \mathbf{a} = (1,2) \) and \( \mathbf{b} = (3,4) \).
  • Question 3: Check if \( \mathbf{x} = (1,2) \) and \( \mathbf{y} = (-2,1) \) are orthogonal.
  • Answer 1: \( \|\mathbf{v}\| = \sqrt{3^2 + 4^2} = 5 \).
  • Answer 2: \( \langle \mathbf{a}, \mathbf{b} \rangle = (1)(3) + (2)(4) = 11 \).
  • Answer 3: \( \langle \mathbf{x}, \mathbf{y} \rangle = (1)(-2) + (2)(1) = 0 \), so they are orthogonal.
