====== Math 753/853 QR and the least-squares problem ======

The QR decomposition is useful for solving the //linear least-squares problem//. Briefly, suppose you have an $Ax=b$ system with an oblong matrix $A$, i.e. $A$ is an $m \times n$ matrix with $n<m$. Each of the $m$ rows of $A$ corresponds to a linear equation in the $n$ unknown variables that are the components of $x$. But with $n<m$, we have more equations than unknowns, and in general a system with more equations than unknowns does not have a solution!

So, instead of looking for an $x$ such that $Ax=b$, we look for an $x$ such that $Ax-b$ is small. Since $Ax-b$ is a vector, we measure its size with a norm. That means we are looking for the $x$ that minimizes $\|Ax-b\|$.

Let $r = Ax-b$. Thinking geometrically, $\|r\| = \|Ax-b\|$ will be smallest when $r$ is orthogonal to the span of the columns of $A$. Recall that for the (reduced) QR factorization $A=QR$, the columns of $Q$ are an orthonormal basis for the span of the columns of $A$. So $r$ is orthogonal to the span of the columns of $A$ when

\begin{align*}
Q^T r &= 0\\
Q^T (Ax-b) &= 0\\
Q^T A x &= Q^T b \\
Q^T Q R x &= Q^T b \\
R x &= Q^T b
\end{align*}

Since $Q$ is an $m \times n$ matrix whose columns are orthonormal, $Q^T Q = I$ is the $n \times n$ identity matrix. Since $R$ is $n\times n$, we now have the same number of equations as unknowns. The last line, $Rx = Q^Tb$, is a square upper-triangular system, which we can solve by backsubstitution as long as the columns of $A$ are linearly independent, so that the diagonal entries of $R$ are nonzero.

That's a quick recap of least-squares via QR. For more detail, see these [[http://www.math.utah.edu/~pa/6610/20130927.pdf|lecture notes]] from the University of Utah.
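
To make this concrete, here is a minimal sketch in Python/NumPy (not part of the original notes; the data ''t'' and ''b'' below are made-up values for a small line-fitting example). It computes the reduced QR factorization of $A$ and then solves $Rx = Q^Tb$ by backsubstitution.

<code python>
import numpy as np
from scipy.linalg import solve_triangular

# Made-up overdetermined example: fit a line c0 + c1*t to five samples,
# so A is 5x2 (more equations than unknowns).
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
b = np.array([1.1, 1.9, 3.2, 3.9, 5.1])
A = np.column_stack([np.ones_like(t), t])   # columns: 1, t

# Reduced QR: Q is 5x2 with orthonormal columns, R is 2x2 upper triangular.
Q, R = np.linalg.qr(A, mode='reduced')

# Solve R x = Q^T b by backsubstitution (R is upper triangular).
x = solve_triangular(R, Q.T @ b, lower=False)

print(x)  # least-squares coefficients [c0, c1]

# Sanity check against NumPy's built-in least-squares solver.
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))
</code>

The call to ''solve_triangular'' carries out the backsubstitution step in the derivation above; ''np.linalg.lstsq'' is used only as a cross-check on the answer.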
