The Gram-Schmidt Process


Last updated 6 years ago

Orthogonal bases are useful for many things, and it would be nice to have a strategy for constructing one for any subspace. This tool exists and is called the Gram-Schmidt process: you start with an arbitrary basis for the subspace and orthogonalize it one vector at a time.

Let $\{x_1, \dots, x_n\}$ be a basis for a subspace $W$ of $\mathbb{R}^n$. To construct an orthogonal basis for this subspace:

$$
\begin{aligned}
v_1 &= x_1\\
v_2 &= x_2 - \left(\frac{v_1 \cdot x_2}{v_1 \cdot v_1}\right)v_1\\
v_3 &= x_3 - \left(\frac{v_1 \cdot x_3}{v_1 \cdot v_1}\right)v_1 - \left(\frac{v_2 \cdot x_3}{v_2 \cdot v_2}\right)v_2\\
&\;\;\vdots
\end{aligned}
$$
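The recursion above translates almost directly into code. Here is a minimal numpy sketch (the function name `gram_schmidt` and the column-vector layout are illustrative choices, not from the text):

```python
import numpy as np

def gram_schmidt(X):
    """Orthogonalize the columns of X one vector at a time.

    Assumes the columns of X are linearly independent (a basis for W).
    Returns a matrix whose columns are the orthogonal vectors v_1, v_2, ...
    """
    V = []
    for x in X.T:
        v = x.astype(float)
        # Subtract the projection of x onto each previously built v_i:
        # v = x - sum_i ((v_i . x) / (v_i . v_i)) v_i
        for u in V:
            v = v - (u @ x) / (u @ u) * u
        V.append(v)
    return np.column_stack(V)
```

For example, starting from the basis $x_1 = (1, 1, 0)$, $x_2 = (1, 0, 1)$, the process keeps $v_1 = x_1$ and replaces $x_2$ with $v_2 = (0.5, -0.5, 1)$, which is orthogonal to $v_1$.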

This makes sense intuitively: from each new vector you 'remove' the component that already lies in the span of the vectors built so far, so what remains is orthogonal to all of them. To obtain an orthonormal basis, simply normalize the vectors produced by the Gram-Schmidt process.
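The normalization step can be sketched in numpy as well (`normalize_columns` is an illustrative name; the input is assumed to already be a set of orthogonal column vectors, e.g. the output of the Gram-Schmidt process above):

```python
import numpy as np

def normalize_columns(V):
    """Divide each column of V by its length, producing unit vectors.

    If the columns of V are pairwise orthogonal, the result is an
    orthonormal set, i.e. Q.T @ Q is the identity matrix.
    """
    return V / np.linalg.norm(V, axis=0)
```

For instance, normalizing the orthogonal vectors $v_1 = (1, 1, 0)$ and $v_2 = (0.5, -0.5, 1)$ yields a matrix $Q$ with $Q^T Q = I$.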