User talk:Wallers

Welcome!

Hello, and welcome to Wikipedia! Thank you for your contributions. I hope you like the place and decide to stay. Here are some pages that you might find helpful:
 * Introduction
 * The five pillars of Wikipedia
 * How to edit a page
 * Help pages
 * How to write a great article
 * Manual of Style

I hope you enjoy editing here and being a Wikipedian! Please sign your name on talk pages using four tildes (~~~~), which will automatically produce your name and the date.

If you need help, check out Questions, ask me on my talk page, or ask your question on your own talk page. Again, welcome!

Considering changes to the numerical methods for linear least squares page
I dislike the discussion of using the QR decomposition to solve the linear least squares problem. I much prefer Susan Blackford's discussion on the LAPACK pages: http://netlib.org/lapack/lug/node40.html. I'm considering changing it to something like the following, in line with her discussion.

The QR decomposition allows us to decompose the matrix $$X$$ into

$$X = QR$$

where $$Q$$ is an $$m \times m$$ orthogonal matrix and $$R$$ is an $$m \times n$$ upper triangular matrix. Because the system is overdetermined, $$m > n$$, and we have a special case of the QR decomposition where

$$X = \begin{bmatrix} Q_1 & Q_2 \end{bmatrix} \begin{bmatrix} R_n \\ \mathbf{0} \end{bmatrix}.$$

Here $$Q_1$$ comprises the first $$n$$ columns of $$Q$$, $$Q_2$$ the remaining $$m - n$$ columns, and $$R_n$$ is an $$n \times n$$ upper triangular matrix.
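This block structure can be checked numerically. The sketch below uses NumPy's full QR factorization on a synthetic tall matrix (the data and dimensions are hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical example: a tall (overdetermined) design matrix, m > n.
rng = np.random.default_rng(0)
m, n = 6, 3
X = rng.standard_normal((m, n))

# Full QR: Q is m x m orthogonal, R is m x n with zeros below row n.
Q, R = np.linalg.qr(X, mode="complete")
Q1, Q2 = Q[:, :n], Q[:, n:]   # split Q into [Q1, Q2]
Rn = R[:n, :]                 # the n x n upper triangular block

assert np.allclose(R[n:, :], 0)   # bottom block of R is zero
assert np.allclose(Q1 @ Rn, X)    # the thin factorization reproduces X
```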

The goal of the linear least squares problem is to find the $$\mathbf{\beta}$$ which minimizes $$||\mathbf{y} - X \mathbf{\beta}||_2$$. Multiplying by an orthogonal matrix such as $$Q^T$$ does not alter the L2 norm, so

$$\begin{align} ||\mathbf{y} - X \mathbf{\beta}||_2 &= ||Q^T\mathbf{y} - Q^T X \mathbf{\beta}||_2 \\ &= \left|\left|\begin{bmatrix} Q_1^T \mathbf{y} - R_n \mathbf{\beta} \\ Q_2^T \mathbf{y} \end{bmatrix}\right|\right|_2 \end{align}$$

The upper portion can be made zero by choosing $$\mathbf{\beta}$$ to satisfy $$R_n \mathbf{\beta} = Q_1^T \mathbf{y}$$, a triangular system solved by back substitution:

$$ \mathbf{\beta} = R_n^{-1} Q_1^T \mathbf{y} $$,
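The solve step can be sketched in NumPy as follows. The data are synthetic and hypothetical, and `np.linalg.solve` stands in for a dedicated back-substitution routine (in practice one would use a triangular solver such as SciPy's `solve_triangular`):

```python
import numpy as np

# Synthetic, hypothetical data -- not from the article.
rng = np.random.default_rng(1)
m, n = 8, 3
X = rng.standard_normal((m, n))
y = rng.standard_normal(m)

# Full QR: Q is m x m orthogonal, R is m x n with zeros below row n.
Q, R = np.linalg.qr(X, mode="complete")
Q1, Rn = Q[:, :n], R[:n, :]

# Solve the triangular system Rn @ beta = Q1.T @ y.  Rn is upper
# triangular, so this step is back substitution in practice.
beta = np.linalg.solve(Rn, Q1.T @ y)

# Agrees with NumPy's built-in least-squares solver.
beta_ref, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(beta, beta_ref)
```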

allowing us to calculate the predicted values

$$\hat{\mathbf{y}} = X \mathbf{\beta}$$.

The residual sum of squares can be calculated either by taking the difference of the predicted and actual values

$$||\mathbf{r}||_2^2 = ||\mathbf{y} - \hat{\mathbf{y}}||_2^2 $$

or by noting that the residual sum of squares equals the norm of the portion that is independent of $$\mathbf{\beta}$$:

$$\begin{align}||\mathbf{r}||_2^2 &= ||\mathbf{y} - X \mathbf{\beta}||_2^2 \\ &= \left|\left| \begin{bmatrix}\mathbf{0} \\ Q_2^T \mathbf{y} \end{bmatrix}\right|\right|_2^2 \\ &= ||Q_2^T \mathbf{y}||_2^2. \end{align}$$
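The equivalence of the two residual formulas can be verified numerically. This sketch again uses synthetic, hypothetical data:

```python
import numpy as np

# Synthetic data for checking that the two residual formulas agree.
rng = np.random.default_rng(2)
m, n = 10, 4
X = rng.standard_normal((m, n))
y = rng.standard_normal(m)

Q, R = np.linalg.qr(X, mode="complete")
Q1, Q2, Rn = Q[:, :n], Q[:, n:], R[:n, :]
beta = np.linalg.solve(Rn, Q1.T @ y)

rss_direct = np.sum((y - X @ beta) ** 2)   # ||y - X beta||_2^2
rss_qr = np.sum((Q2.T @ y) ** 2)           # ||Q2^T y||_2^2
assert np.allclose(rss_direct, rss_qr)
```

Note that the second form gives the residual sum of squares without ever forming the fitted values, which is how LAPACK's least squares drivers report it.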

Wallers (talk) 15:49, 17 June 2010 (UTC)