- Understand the definition of inner product spaces.
- Compute norms, angles, and distances in inner product spaces.
- Explore the concept of orthogonality.
- Learn the Gram-Schmidt orthogonalization process.
- Use inner products for projections and least squares approximation.
Definition of Inner Product Spaces
An inner product space is a vector space with an operation called the **inner product**, which measures similarity between vectors.
For a real vector space, the inner product of two vectors \( \mathbf{u}, \mathbf{v} \) is:
\[ \langle \mathbf{u}, \mathbf{v} \rangle = u_1 v_1 + u_2 v_2 + \dots + u_n v_n \]

For a complex vector space, the inner product is:
\[ \langle \mathbf{u}, \mathbf{v} \rangle = u_1 \overline{v_1} + u_2 \overline{v_2} + \dots + u_n \overline{v_n} \]

Norms, Angles, and Orthogonality
The **norm** (or length) of a vector is:
\[ \|\mathbf{v}\| = \sqrt{\langle \mathbf{v}, \mathbf{v} \rangle} \]

The **angle** between two vectors is given by:
\[ \cos\theta = \frac{\langle \mathbf{u}, \mathbf{v} \rangle}{\|\mathbf{u}\| \|\mathbf{v}\|} \]

Two vectors are **orthogonal** if:
\[ \langle \mathbf{u}, \mathbf{v} \rangle = 0 \]

Gram-Schmidt Orthogonalization
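The inner product, norm, angle, and orthogonality test defined above all reduce to a few lines of code. A minimal sketch using NumPy (the library choice and the example vectors are ours, not the text's):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 2.0])

# Inner product: np.vdot conjugates its FIRST argument, while the text
# conjugates the second; for real vectors the two conventions agree.
inner = np.vdot(u, v)

# Norm ||v|| = sqrt(<v, v>).
norm_u = np.sqrt(np.vdot(u, u).real)
norm_v = np.sqrt(np.vdot(v, v).real)

# Angle: cos(theta) = <u, v> / (||u|| ||v||).
cos_theta = inner.real / (norm_u * norm_v)
theta = np.arccos(cos_theta)

# Orthogonality: <u, v> == 0 (up to floating-point tolerance).
orthogonal = np.isclose(inner, 0.0)
```

For complex vectors the same calls work unchanged, which is the main reason to prefer `np.vdot` over a plain elementwise product here.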
The **Gram-Schmidt process** converts a set of linearly independent vectors into an orthonormal set that spans the same subspace.
- Start with linearly independent vectors \( \mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n \).
- Compute an orthogonal basis \( \mathbf{q}_1, \mathbf{q}_2, \dots, \mathbf{q}_n \) using:

\[ \mathbf{q}_k = \mathbf{v}_k - \sum_{j=1}^{k-1} \frac{\langle \mathbf{q}_j, \mathbf{v}_k \rangle}{\langle \mathbf{q}_j, \mathbf{q}_j \rangle} \mathbf{q}_j, \]

then normalize each \( \mathbf{q}_k \) to unit length.
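The process described above can be sketched in a few lines. A minimal NumPy implementation (our sketch, not the text's; it assumes the inputs are linearly independent and normalizes each vector as it is produced, so the projection denominators are all 1):

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal list built from linearly independent vectors."""
    basis = []
    for v in vectors:
        q = np.array(v, dtype=float)
        # Subtract the component of v along each previously built q_j.
        for b in basis:
            q = q - (b @ q) * b              # b is unit length, so <b, b> = 1
        basis.append(q / np.linalg.norm(q))  # normalize to unit length
    return basis

q1, q2, q3 = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
```

Normalizing as we go is the "modified" ordering of the computation; it produces the same basis as the formula but is numerically better behaved.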
Proof
The Gram-Schmidt process constructs orthonormal vectors by subtracting from each vector its components along the previously constructed ones.
Starting with \( \mathbf{q}_1 = \frac{\mathbf{v}_1}{\|\mathbf{v}_1\|} \), we compute \( \mathbf{q}_2 \):
\[ \mathbf{q}_2 = \mathbf{v}_2 - \frac{\langle \mathbf{q}_1, \mathbf{v}_2 \rangle}{\langle \mathbf{q}_1, \mathbf{q}_1 \rangle} \mathbf{q}_1 \]

This ensures \( \langle \mathbf{q}_1, \mathbf{q}_2 \rangle = 0 \), making the pair orthogonal; normalizing \( \mathbf{q}_2 \) to unit length then makes it orthonormal. The same argument applies at each later step.
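The orthogonality claim in this step is easy to verify numerically for a concrete pair (the example vectors are ours, chosen for illustration):

```python
import numpy as np

v1 = np.array([3.0, 4.0])
v2 = np.array([1.0, 2.0])

q1 = v1 / np.linalg.norm(v1)            # q1 = v1 / ||v1||
q2 = v2 - ((q1 @ v2) / (q1 @ q1)) * q1  # subtract the projection onto q1

# <q1, q2> should vanish, up to floating-point error.
residual = q1 @ q2
```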
Projections and Least Squares
The **projection** of a vector \( \mathbf{v} \) onto \( \mathbf{u} \) is:
\[ \text{proj}_{\mathbf{u}} \mathbf{v} = \frac{\langle \mathbf{u}, \mathbf{v} \rangle}{\langle \mathbf{u}, \mathbf{u} \rangle} \mathbf{u} \]

The **least squares solution** to \( A\mathbf{x} = \mathbf{b} \) minimizes:
\[ \| A\mathbf{x} - \mathbf{b} \| \]

Projecting \( \mathbf{b} \) onto the column space of \( A \) yields the normal equations:
\[ A^T A \mathbf{x} = A^T \mathbf{b} \]

Examples
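As a first, numerical illustration, the normal equations \( A^T A \mathbf{x} = A^T \mathbf{b} \) can be solved directly and compared against NumPy's built-in least-squares routine (a sketch; the data below is invented for illustration):

```python
import numpy as np

# Overdetermined system: fit a line y = c0 + c1 * t to three points.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 2.0])

# Solve the normal equations A^T A x = A^T b.
x = np.linalg.solve(A.T @ A, A.T @ b)

# np.linalg.lstsq minimizes ||Ax - b|| directly and should agree.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Forming \( A^T A \) squares the condition number of the problem, so in practice `np.linalg.lstsq` (which uses an orthogonal factorization internally) is the safer route for ill-conditioned \( A \).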
Example 1: Compute the inner product of \( \mathbf{u} = (1,2,3) \) and \( \mathbf{v} = (4,-1,2) \).
\[ \langle \mathbf{u}, \mathbf{v} \rangle = (1)(4) + (2)(-1) + (3)(2) = 4 - 2 + 6 = 8 \]

Exercises
- Question 1: Find the norm of \( \mathbf{v} = (3,4) \).
- Question 2: Compute the inner product of \( \mathbf{a} = (1,2) \) and \( \mathbf{b} = (3,4) \).
- Question 3: Check if \( \mathbf{x} = (1,2) \) and \( \mathbf{y} = (-2,1) \) are orthogonal.
- Answer 1: \( \|\mathbf{v}\| = \sqrt{3^2 + 4^2} = 5 \).
- Answer 2: \( \langle \mathbf{a}, \mathbf{b} \rangle = (1)(3) + (2)(4) = 11 \).
- Answer 3: \( \langle \mathbf{x}, \mathbf{y} \rangle = (1)(-2) + (2)(1) = 0 \), so they are orthogonal.
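The three answers can be checked in a few lines (a sketch with NumPy):

```python
import numpy as np

# Answer 1: ||(3, 4)|| = 5.
v = np.array([3.0, 4.0])
assert np.isclose(np.linalg.norm(v), 5.0)

# Answer 2: <(1, 2), (3, 4)> = 3 + 8 = 11.
a, b = np.array([1.0, 2.0]), np.array([3.0, 4.0])
assert np.isclose(a @ b, 11.0)

# Answer 3: <(1, 2), (-2, 1)> = -2 + 2 = 0, so the vectors are orthogonal.
x, y = np.array([1.0, 2.0]), np.array([-2.0, 1.0])
assert np.isclose(x @ y, 0.0)
```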