How do projections play a role in the least-squares solution?
Answers
Answered by kvnmurty
In two dimensions, we may have a system of two linear equations such as:
$$2x_1 + 3x_2 = 5$$
$$3x_1 - 4x_2 = -1$$
Then we can solve for an exact solution, if it exists, by the usual elimination methods or in matrix form:
$$AX = b, \quad\text{where}\quad A = \begin{bmatrix} 2 & 3 \\ 3 & -4 \end{bmatrix},\quad X = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix},\quad b = \begin{bmatrix} 5 \\ -1 \end{bmatrix}.$$

To solve for $X$, we compute $X = A^{-1}b$.
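Carrying the arithmetic through (a worked check, not spelled out in the original answer): with $\det A = 2(-4) - 3(3) = -17$,

$$A^{-1} = \frac{1}{-17}\begin{bmatrix} -4 & -3 \\ -3 & 2 \end{bmatrix},\qquad X = A^{-1}b = \frac{1}{-17}\begin{bmatrix} -20 + 3 \\ -15 - 2 \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \end{bmatrix},$$

so $x_1 = x_2 = 1$ solves the system exactly.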
In practice, however, we have many data points and must fit a curve that best matches them. Usually no $X$ exists for which $AX = b$ is satisfied exactly: with $m$ data points and $n$ unknowns ($m > n$), $A$ is a rectangular $m \times n$ matrix of rank $n$, and a rectangular matrix has no inverse.
In such cases we find the best fit by projecting $b$ onto the column space of $A$: we choose the $X$ that minimizes the sum of squared distances (residuals) between the data and the fitted model. This is the least-squares problem: we find the solution nearest to the given data, stated formally below.
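Written out in symbols (a restatement of the above, with $\hat{X}$ denoting the least-squares solution):

$$\hat{X} = \arg\min_{X} \|AX - b\|^2 = \arg\min_{X} \sum_{i=1}^{m}\Big(\sum_{j=1}^{n} a_{ij}x_j - b_i\Big)^2.$$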
In the general $n$-unknown case we again have $AX = b$, where $A$ is rectangular of size $m \times n$ with rank $n$.
Let $W$ be the column space of $A$, an $n$-dimensional subspace of $\mathbb{R}^m$.
The projection of the vector $b$ onto $W$ is $P = A\hat{X}$, and this projection determines the solution $\hat{X}$.
The vector orthogonal to $W$ (and hence to the projection $P$) is the residual $b - A\hat{X}$; its length is the distance from the data vector $b$ to the subspace $W$.
Then, from the orthogonality of the residual to every column of $A$:

$$A^T\big(b - A\hat{X}\big) = 0 \quad\Longrightarrow\quad A^T A\,\hat{X} = A^T b \quad\Longrightarrow\quad \hat{X} = \big(A^T A\big)^{-1} A^T b.$$
Thus the unique solution of the least-squares problem is given by the matrix equation above (the normal equations). Since $A$ has rank $n$, it can be proved that $A^T A$ is non-singular, so its inverse exists.
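As a minimal sketch of the normal-equation solution in NumPy (the matrix and right-hand side below are made-up illustrative data, not from the answer above):

```python
import numpy as np

# A small over-determined system: 4 equations, 2 unknowns (m > n).
# These entries are illustrative, not from the original answer.
A = np.array([[2.0,  3.0],
              [3.0, -4.0],
              [1.0,  1.0],
              [2.0, -1.0]])
b = np.array([5.0, -1.0, 2.5, 0.5])

# Normal equations: A^T A X = A^T b.
# Solving this linear system is preferred over forming (A^T A)^{-1} explicitly.
X_hat = np.linalg.solve(A.T @ A, A.T @ b)

# NumPy's least-squares routine should agree (it uses an SVD internally,
# which is numerically more stable than the normal equations).
X_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print("normal equations:", X_hat)
print("np.linalg.lstsq :", X_lstsq)

# Orthogonality check: the residual b - A X_hat lies orthogonal to the
# column space of A, so A^T (b - A X_hat) should be numerically zero.
print("A^T (b - A X)   =", A.T @ (b - A @ X_hat))
```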