====== Math 753/853 QR and the least-squares problem ======

The QR decomposition is useful for solving the //linear least-squares problem//. Briefly, suppose you have an $Ax=b$ system with an oblong matrix $A$, i.e. $A$ is an $m \times n$ matrix with $n<m$.

Each of the $m$ rows of $A$ corresponds to a linear equation in the $n$ unknown variables that are the components of $x$. But with $n<m$, that means we have more equations than unknowns. In general, a system with more equations than unknowns does not have a solution!

So, instead of looking for an $x$ such that $Ax=b$, we look for an $x$ such that $Ax-b$ is small. Since $Ax-b$ is a vector, we measure its size with a norm. That means we are looking for the $x$ that minimizes $\|Ax-b\|$.

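For a quick numerical illustration, here is a small overdetermined system solved with NumPy's built-in least-squares routine (a sketch added for concreteness; the matrix and right-hand side are made-up example data):

<code python>
import numpy as np

# An overdetermined system: 3 equations, 2 unknowns (example data).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# No x satisfies Ax = b exactly; lstsq returns the x minimizing ||Ax - b||.
x, *_ = np.linalg.lstsq(A, b, rcond=None)

print(x)                          # the least-squares solution
print(np.linalg.norm(A @ x - b))  # the minimized residual norm ||Ax - b||
</code>
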
Let $r = Ax-b$. Thinking geometrically, $\|r\| = \|Ax-b\|$ will be smallest when $r$ is orthogonal to the span of the columns of $A$. Recall that for the QR factorization $A=QR$, the columns of $Q$ are an orthonormal basis for the span of the columns of $A$. So $r$ is orthogonal to the span of the columns of $A$ when
\begin{equation*}
Q^T r = 0
\end{equation*}
\begin{equation*}
Q^T (Ax-b) = 0
\end{equation*}
\begin{equation*}
Q^T A x = Q^T b
\end{equation*}
\begin{equation*}
Q^T Q R x = Q^T b
\end{equation*}
\begin{equation*}
R x = Q^T b
\end{equation*}

Since $Q$ is an $m \times n$ matrix whose columns are orthonormal, $Q^T Q = I$ is the $n \times n$ identity matrix. Since $R$ is $n \times n$, we now have the same number of equations as unknowns. The last line, $Rx = Q^Tb$, is a square upper-triangular system which we can solve by backsubstitution.

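In code, the whole procedure is only a few lines. Here is a minimal sketch in Python/NumPy (an illustration added here, not code from the course materials), using the reduced QR factorization and a hand-written backsubstitution:

<code python>
import numpy as np

def backsubstitution(R, c):
    """Solve the square upper-triangular system Rx = c by backsubstitution."""
    n = len(c)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # Subtract the already-computed components, then divide by the pivot.
        x[i] = (c[i] - R[i, i+1:] @ x[i+1:]) / R[i, i]
    return x

def leastsquares_qr(A, b):
    """Minimize ||Ax - b|| by solving Rx = Q^T b, where A = QR."""
    Q, R = np.linalg.qr(A)   # reduced QR: Q is m x n, R is n x n
    return backsubstitution(R, Q.T @ b)

# Check against NumPy's built-in solver on a random oblong system.
m, n = 6, 3
A = np.random.randn(m, n)
b = np.random.randn(m)
x = leastsquares_qr(A, b)
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))  # True
</code>
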
+ | |||
+ | That's a quick recap of least-squares via QR. For more detail, see these | ||
+ | [[http://www.math.utah.edu/~pa/6610/20130927.pdf|lecture notes]] from the University of Utah. |