Least Squares Theorem

$$
A^T A \bar{x} = A^T b \\
\text{or} \\
\bar{x} = (A^T A)^{-1} A^T b
$$

The idea is to find some vector $\bar{x}$ for a given matrix $A$ and vector $\vec{b}$ that satisfies $A\bar{x} = b$. But the system is inconsistent: $b - A\bar{x} \neq 0$ for every choice of $\bar{x}$. So we look for the $\bar{x}$ that comes as close as possible. The difference $b - A\bar{x}$ is some error vector $\vec{e}$, and we can take the length of this error vector to express the accuracy of our solution. Optimally, we find the solution whose error is as close to $0$ as possible: the least squares solution. The error is shortest exactly when $b - A\bar{x}$ is orthogonal to the column space of $A$, i.e. $A^T(b - A\bar{x}) = 0$, which rearranges to the normal equations above.
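For a concrete illustration, here is a minimal NumPy sketch (the values of $A$ and $b$ below are made up for the example) that solves the normal equations directly and cross-checks the result against NumPy's built-in least squares routine:

```python
import numpy as np

# A small overdetermined system: 3 equations, 2 unknowns, no exact solution.
# The values here are made up purely for illustration.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Least squares via the normal equations: A^T A x_bar = A^T b.
x_bar = np.linalg.solve(A.T @ A, A.T @ b)

# The error vector e = b - A x_bar; its length measures how close we got.
e = b - A @ x_bar
print("x_bar:", x_bar)
print("||e||:", np.linalg.norm(e))

# Cross-check with NumPy's built-in least squares solver.
x_check, *_ = np.linalg.lstsq(A, b, rcond=None)
print("lstsq:", x_check)
```

Solving the normal equations directly works when the columns of $A$ are independent, so that $A^T A$ is invertible; in practice a routine like `lstsq`, which uses the singular value decomposition, is the more numerically stable choice.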
