So, instead of looking for an $x$ such that $Ax=b$, we look for an $x$ such that $Ax-b$ is small. Since $Ax-b$ is a vector, we measure its size with a norm. That means we are looking for the $x$ that minimizes $\|Ax-b\|$.
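As a concrete illustration, suppose we fit a line $y = c_1 + c_2 t$ to four data points. That gives a $4 \times 2$ system $Ax=b$ with no exact solution, and different candidate $x$'s give different residual norms $\|Ax-b\|$. Here is a minimal sketch in Python/NumPy (the data values are made up for illustration):

<code python>
import numpy as np

# Hypothetical data: fit y = c1 + c2*t to four points (t_i, y_i).
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.8])

# Overdetermined system: A is 4x2, b has length 4, so Ax = b has no exact solution.
A = np.column_stack([np.ones_like(t), t])
b = y

# Compare the residual norm ||Ax - b|| for two candidate coefficient vectors.
for x in (np.array([1.0, 1.0]), np.array([1.05, 0.95])):
    r = A @ x - b
    print(x, np.linalg.norm(r))
</code>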
  
Let $r = Ax-b$. Thinking geometrically, $\|r\| = \|Ax-b\|$ will be smallest when $r$ is orthogonal to the span of the columns of $A$. Recall that for the QR factorization $A=QR$, the columns of $Q$ are an orthonormal basis for the span of the columns of $A$. So $r$ is orthogonal to the span of the columns of $A$ when
  
\begin{equation*}
Q^T r = 0
\end{equation*}
\begin{equation*}
Q^T (Ax-b) = 0
\end{equation*}
\begin{equation*}
Q^T A x = Q^T b
\end{equation*}
\begin{equation*}
Q^T Q R x = Q^T b
\end{equation*}
\begin{equation*}
R x = Q^T b
\end{equation*}
  
Since $Q$ is an $m \times n$ matrix whose columns are orthonormal, $Q^T Q = I$ is the $n \times n$ identity matrix. Since $R$ is $n\times n$, we now have the same number of equations as unknowns. The last line, $Rx = Q^Tb$, is a square upper-triangular system which we can solve by backsubstitution.
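Putting the pieces together, here is a minimal sketch of least-squares via QR in Python/NumPy, assuming the thin (reduced) QR factorization so that $Q$ is $m \times n$ and $R$ is $n \times n$. The random test matrix and the check against NumPy's built-in least-squares solver are just for illustration; the backsubstitution loop is written out explicitly to match the last step above.

<code python>
import numpy as np

# A random overdetermined system: m equations, n unknowns, m > n.
m, n = 6, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Thin QR factorization: Q is m x n with orthonormal columns, R is n x n upper triangular.
Q, R = np.linalg.qr(A)   # 'reduced' mode is the default

# Solve R x = Q^T b by backsubstitution, from the last row up.
c = Q.T @ b
x = np.zeros(n)
for i in range(n - 1, -1, -1):
    x[i] = (c[i] - R[i, i+1:] @ x[i+1:]) / R[i, i]

# Sanity check against NumPy's built-in least-squares solver.
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.allclose(x, x_ref))   # should print True
</code>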
  
That's a quick recap of least-squares via QR. For more detail, see these [[http://www.math.utah.edu/~pa/6610/20130927.pdf|lecture notes]] from the University of Utah.