namespace Eigen {

/** \eigenManualPage LeastSquares Solving linear least squares systems

This page describes how to solve linear least squares systems using %Eigen. An overdetermined system
of equations, say \a Ax = \a b, generally has no exact solution. In that case, it makes sense to search
for the vector \a x which is closest to being a solution, in the sense that the difference \a Ax - \a b
is as small as possible. This \a x is called the least squares solution (if the Euclidean norm is used).

The three methods discussed on this page are the SVD decomposition, the QR decomposition and the normal
equations. Of these, the SVD decomposition is generally the most accurate but also the slowest, the
normal equations are the fastest but the least accurate, and the QR decomposition is in between.

\eigenAutoToc


\section LeastSquaresSVD Using the SVD decomposition

The \link BDCSVD::solve() solve() \endlink method in the BDCSVD class can be directly used to
solve linear least squares systems. It is not enough to compute only the singular values (the default
for this class); you also need the singular vectors, but the thin SVD decomposition suffices for
computing least squares solutions:

<table class="example">
<tr><th>Example:</th><th>Output:</th></tr>
<tr>
  <td>\include TutorialLinAlgSVDSolve.cpp </td>
  <td>\verbinclude TutorialLinAlgSVDSolve.out </td>
</tr>
</table>

This example is taken from the page \link TutorialLinearAlgebra Linear algebra and decompositions \endlink.
If you just need to solve the least squares problem, but are not interested in the SVD per se, a
faster alternative method is CompleteOrthogonalDecomposition.
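
A rough sketch of both calls is given below; the random \c A and \c b are just placeholders, and the
thin SVD is requested through the run-time computation options (\c Eigen::ComputeThinU | \c Eigen::ComputeThinV):

\code
#include <Eigen/Dense>
#include <iostream>

int main()
{
  Eigen::MatrixXf A = Eigen::MatrixXf::Random(4, 2);
  Eigen::VectorXf b = Eigen::VectorXf::Random(4);

  // Thin SVD: the thin U and V factors are all that solve() needs.
  std::cout << "SVD solution:\n"
            << A.bdcSvd(Eigen::ComputeThinU | Eigen::ComputeThinV).solve(b) << "\n\n";

  // Faster alternative when only the least squares solution is wanted.
  std::cout << "COD solution:\n"
            << A.completeOrthogonalDecomposition().solve(b) << "\n";
}
\endcode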


\section LeastSquaresQR Using the QR decomposition

The solve() method in the QR decomposition classes also computes the least squares solution. There are
three QR decomposition classes: HouseholderQR (no pivoting, fast but unstable if your matrix is
not of full rank), ColPivHouseholderQR (column pivoting, thus a bit slower but more stable) and
FullPivHouseholderQR (full pivoting, so slowest and slightly more stable than ColPivHouseholderQR).
Here is an example with column pivoting:

<table class="example">
<tr><th>Example:</th><th>Output:</th></tr>
<tr>
  <td>\include LeastSquaresQR.cpp </td>
  <td>\verbinclude LeastSquaresQR.out </td>
</tr>
</table>
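
For comparison, all three decompositions can be invoked through the corresponding convenience methods
on the matrix. The sketch below uses a placeholder \c A and \c b:

\code
#include <Eigen/Dense>
#include <iostream>

int main()
{
  Eigen::MatrixXd A = Eigen::MatrixXd::Random(5, 3);
  Eigen::VectorXd b = Eigen::VectorXd::Random(5);

  // No pivoting: fastest, but only reliable when A has full rank.
  Eigen::VectorXd x1 = A.householderQr().solve(b);
  // Column pivoting: a bit slower, but more stable.
  Eigen::VectorXd x2 = A.colPivHouseholderQr().solve(b);
  // Full pivoting: slowest, slightly more stable still.
  Eigen::VectorXd x3 = A.fullPivHouseholderQr().solve(b);

  std::cout << x1 << "\n\n" << x2 << "\n\n" << x3 << "\n";
}
\endcode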


\section LeastSquaresNormalEquations Using normal equations

Finding the least squares solution of \a Ax = \a b is equivalent to solving the normal equation
<i>A</i><sup>T</sup><i>Ax</i> = <i>A</i><sup>T</sup><i>b</i>. This leads to the following code:

<table class="example">
<tr><th>Example:</th><th>Output:</th></tr>
<tr>
  <td>\include LeastSquaresNormalEquations.cpp </td>
  <td>\verbinclude LeastSquaresNormalEquations.out </td>
</tr>
</table>
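
The essential step is to form <i>A</i><sup>T</sup><i>A</i> and <i>A</i><sup>T</sup><i>b</i> and hand them
to any solver for square systems. As a sketch (with a placeholder \c A and \c b), an LDLT factorization is
a natural choice because <i>A</i><sup>T</sup><i>A</i> is symmetric positive semi-definite:

\code
#include <Eigen/Dense>
#include <iostream>

int main()
{
  Eigen::MatrixXd A = Eigen::MatrixXd::Random(6, 3);
  Eigen::VectorXd b = Eigen::VectorXd::Random(6);

  // Solve the normal equations A^T A x = A^T b with an LDLT factorization.
  Eigen::VectorXd x = (A.transpose() * A).ldlt().solve(A.transpose() * b);

  std::cout << x << "\n";
}
\endcode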

This method is usually the fastest, especially when \a A is "tall and skinny". However, if the
matrix \a A is even mildly ill-conditioned, this is not a good method, because the condition number
of <i>A</i><sup>T</sup><i>A</i> is the square of the condition number of \a A. This means that you
lose roughly twice as many digits of accuracy using the normal equation, compared to the more stable
methods mentioned above.
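
The squaring of the condition number is easy to observe numerically. The sketch below, with a
hypothetical randomly generated \c A, estimates both condition numbers from the singular values:

\code
#include <Eigen/Dense>
#include <iostream>

int main()
{
  Eigen::MatrixXd A = Eigen::MatrixXd::Random(50, 10);

  // Condition number of A: ratio of largest to smallest singular value.
  Eigen::JacobiSVD<Eigen::MatrixXd> svdA(A);
  double condA = svdA.singularValues()(0) / svdA.singularValues().minCoeff();

  // Condition number of A^T A: comes out as roughly condA * condA.
  Eigen::JacobiSVD<Eigen::MatrixXd> svdAtA(A.transpose() * A);
  double condAtA = svdAtA.singularValues()(0) / svdAtA.singularValues().minCoeff();

  std::cout << "cond(A)     = " << condA << "\n"
            << "cond(A^T A) = " << condAtA << "\n";
}
\endcode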

*/

}