Find the least squares solution for a rank-deficient system
Find the least squares solution to the system
$$x - y = 4$$
$$x - y = 6$$
Normally, if I knew what the matrix $A$ and the vector $b$ were, I could just compute $(A^TA)^{-1} A^Tb$, but in this case I'm not sure how to set up my matrices. How can I find the least-squares solution to the system?
linear-algebra systems-of-equations least-squares
The matrices are right there - for A, just pull out the coefficients of the variables on the left-hand side of each linear equation and line them up, and for B, get the constants on the right-hand side.
– ConMan
Aug 1 '16 at 23:05
the only good answer to this problem was downvoted
– Ryan Howe
Oct 16 at 22:10
edited Apr 2 '17 at 2:55 by dantopa
asked Aug 1 '16 at 22:53 by Yusha
6 Answers
We have the linear system
$$\begin{bmatrix} 1 & -1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 4 \\ 6 \end{bmatrix}$$
which can be rewritten in the form
$$\begin{bmatrix} 1 \\ 1 \end{bmatrix} \eta = \begin{bmatrix} 4 \\ 6 \end{bmatrix}$$
where $\eta = x - y$. Left-multiplying both sides by $\begin{bmatrix} 1 \\ 1 \end{bmatrix}^T$, we obtain $2\eta = 10$, i.e., $\eta = 5$. Hence,
$$x - y = 5$$
Thus, there are infinitely many least-squares solutions. One of them is
$$\begin{bmatrix} \hat x \\ \hat y \end{bmatrix} = \begin{bmatrix} 6 \\ 1 \end{bmatrix}$$
answered Aug 2 '16 at 0:59 by Rodrigo de Azevedo
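This can be checked numerically. The sketch below (assuming NumPy is available) uses `numpy.linalg.lstsq`, which returns the minimum-norm least-squares solution for a rank-deficient matrix:

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [1.0, -1.0]])
b = np.array([4.0, 6.0])

# lstsq handles rank-deficient A: it reports the rank and returns
# the minimum-norm solution among all least-squares minimizers.
x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(rank)   # 1
print(x)      # minimum-norm solution: [ 2.5 -2.5]
print(A @ x)  # [5. 5.] -- the projection of b onto the column space
```

The returned solution $(2.5, -2.5)$ satisfies $x - y = 5$, in agreement with the answer; any vector of the form $(2.5 + \xi,\, -2.5 + \xi)$ achieves the same residual.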
@RodrigodeAzevedo Haha I think it might be. My up-vote will bring you back up to zero ;p
– Carser
Aug 2 '16 at 11:48
Why do you have two nearly identical answers to this question?
– Morgan Rodgers
Mar 14 '17 at 3:56
@MorganRodgers Because one is better than the other.
– Rodrigo de Azevedo
Mar 14 '17 at 7:48
First, choose points $(x, y)$ that satisfy each equation:
$$\begin{cases}
x - y = 4, & \text{e.g. } (6, 2) \\
x - y = 6, & \text{e.g. } (10, 4)
\end{cases}$$
Then fit the line $y = mx + b$ through these points as usual:
$$Ax = \begin{bmatrix} 1 & 6 \\ 1 & 10 \end{bmatrix} \begin{bmatrix} b \\ m \end{bmatrix} = \begin{bmatrix} 2 \\ 4 \end{bmatrix}$$
$$\begin{bmatrix} b \\ m \end{bmatrix} = \begin{bmatrix} -1 \\ 1/2 \end{bmatrix}$$
$$y = \tfrac{1}{2}x - 1$$
edited Aug 2 '16 at 1:40; answered Aug 2 '16 at 0:00 by Tomek Dobrzynski
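The $2 \times 2$ solve above can be sketched numerically (variable names are mine; NumPy assumed):

```python
import numpy as np

# Line through the two chosen points: design matrix columns are [1, x],
# unknowns are the intercept b and slope m of y = m*x + b.
A = np.array([[1.0, 6.0],
              [1.0, 10.0]])
rhs = np.array([2.0, 4.0])

b_m = np.linalg.solve(A, rhs)  # exact solve; A is square and invertible
print(b_m)
```

This yields intercept $b = -1$ and slope $m = 1/2$.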
Problem statement: underdetermined system
Start with the linear system
$$
\begin{align}
\mathbf{A} x &= b \\
\left[ \begin{array}{cc} 1 & -1 \\ 1 & -1 \end{array} \right]
\left[ \begin{array}{c} x \\ y \end{array} \right]
&=
\left[ \begin{array}{c} 4 \\ 6 \end{array} \right]
\end{align}
$$
The system matrix has rank $\rho = 1$; therefore, if a solution exists, it will not be unique.
Provided $b \notin \color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)}$, we are guaranteed a least squares solution
$$
x_{LS} = \left\{ x \in \mathbb{C}^{2} \colon \lVert \mathbf{A} x - b \rVert_{2}^{2} \text{ is minimized} \right\}
\tag{1}
$$
Subspace resolution
By inspection, we see that the row space is resolved as
$$
\color{blue}{\mathcal{R} \left( \mathbf{A}^{*} \right)} \oplus
\color{red}{\mathcal{N} \left( \mathbf{A} \right)}
=
\color{blue}{\left[ \begin{array}{r} 1 \\ -1 \end{array} \right]} \oplus
\color{red}{\left[ \begin{array}{c} 1 \\ 1 \end{array} \right]}
$$
The column space is resolved as
$$
\color{blue}{\mathcal{R} \left( \mathbf{A} \right)} \oplus
\color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)}
=
\color{blue}{\left[ \begin{array}{c} 1 \\ 1 \end{array} \right]} \oplus
\color{red}{\left[ \begin{array}{r} -1 \\ 1 \end{array} \right]}
$$
The coloring indicates vectors in the $\color{blue}{\text{range}}$ space and the $\color{red}{\text{null}}$ space.
Finding the least squares solution
Since there is only one vector in $\color{blue}{\mathcal{R} \left( \mathbf{A}^{*} \right)}$, the solution vector will have the form
$$
\color{blue}{x_{LS}} = \alpha
\color{blue}{\left[ \begin{array}{r} 1 \\ -1 \end{array} \right]}
$$
The goal is to find the constant $\alpha$ that minimizes (1):
$$
\color{red}{r}^{2} = \color{red}{r} \cdot \color{red}{r} =
\lVert \color{blue}{\mathbf{A} x_{LS}} - b \rVert_{2}^{2}
= 8 \alpha^{2} - 40 \alpha + 52
$$
The minimum of this polynomial is at
$$
\alpha = \frac{5}{2}
$$
Least squares solution
The set of least squares minimizers in (1) is then the affine set
$$
x_{LS} = \frac{5}{2}
\color{blue}{\left[ \begin{array}{r} 1 \\ -1 \end{array} \right]}
+
\xi
\color{red}{\left[ \begin{array}{r} 1 \\ 1 \end{array} \right]}, \qquad \xi \in \mathbb{C}
$$
(Plot omitted: it showed how the total error $\lVert \mathbf{A} x_{LS} - b \rVert_{2}^{2}$ varies with the fit parameters; the blue dot marked the particular solution, and the dashed line marked the homogeneous solution, which is also the $0$ contour.)
Addendum: existence of the least squares solution
To address the insightful question of @RodrigodeAzevedo, consider the linear system
$$
\begin{align}
\mathbf{A} x &= b \\
\left[ \begin{array}{cc} 1 & 0 \\ 0 & 0 \end{array} \right]
\left[ \begin{array}{c} x \\ y \end{array} \right]
&=
\left[ \begin{array}{c} 0 \\ 1 \end{array} \right]
\end{align}
$$
The data vector $b$ is entirely in the null space of $\mathbf{A}^{*}$:
$b \in \color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)}$.
As pointed out, the system matrix has a singular value decomposition; one instance is
$$
\mathbf{A} = \mathbf{U} \, \Sigma \, \mathbf{V}^{*} = \mathbf{I}_{2}
\left[ \begin{array}{cc} 1 & 0 \\ 0 & 0 \end{array} \right]
\mathbf{I}_{2}
$$
with the concomitant pseudoinverse
$$
\mathbf{A}^{\dagger} = \mathbf{V} \, \Sigma^{\dagger} \mathbf{U}^{*} =
\mathbf{I}_{2}
\left[ \begin{array}{cc} 1 & 0 \\ 0 & 0 \end{array} \right]
\mathbf{I}_{2} = \mathbf{A}
$$
Following least squares canon, the particular solution to the least squares problem is computed as
$$
\color{blue}{x_{LS}} = \mathbf{A}^{\dagger} b =
\color{red}{\left[ \begin{array}{c} 0 \\ 0 \end{array} \right]}
\qquad \Rightarrow\Leftarrow
$$
The color collision (null space [red] = range space [blue]) indicates a problem: there is no component of the particular solution vector in the range space. Mathematicians habitually exclude the $0$ vector as a solution to linear problems.
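A small numerical sketch of the addendum's example (NumPy assumed; `numpy.linalg.pinv` computes the Moore-Penrose pseudoinverse via the SVD):

```python
import numpy as np

# Addendum example: b lies entirely in the null space of A*,
# so A x can never have a nonzero second component.
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])
b = np.array([0.0, 1.0])

x_ls = np.linalg.pinv(A) @ b          # pseudoinverse solution
print(x_ls)                           # the zero vector
print(np.linalg.norm(A @ x_ls - b))   # residual norm ||b|| = 1.0
```

The residual norm cannot be reduced below $\lVert b \rVert = 1$ for any choice of $x$, and the pseudoinverse returns the zero vector.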
If the RHS of the normal equations is zero, there will be infinitely many solutions when $bf A$ does not have full column rank. Lastly, I do not understand your hostility towards the zero solution.
– Rodrigo de Azevedo
Dec 1 at 9:19
@Rodrigo de Azevedo: Here is a picture in 3D: math.stackexchange.com/questions/2253443/…. The least squares solution is $$ x_{LS} = \color{blue}{\mathbf{A}^{+} b} + \color{red}{ \left( \mathbf{I}_{n} - \mathbf{A}^{+} \mathbf{A} \right) y}, \quad y \in \mathbb{C}^{n} \tag{1} $$ The topology of the particular solution is a point (finite), and the homogeneous solution is a hyperplane. The pseudoinverse solution is where the homogeneous solution intersects the range space. There is no intersection here.
– dantopa
Dec 1 at 17:33
Why even bring the pseudoinverse to the discussion?
– Rodrigo de Azevedo
Dec 1 at 17:36
The pseudoinverse provides the most general form of the particular solution (the range space component).
– dantopa
Dec 1 at 18:27
$$A = \left[ \begin{array}{cc} 1 & -1 \\ 1 & -1 \end{array} \right], \quad b = \begin{bmatrix} 4 \\ 6 \end{bmatrix}$$
The thing is, though, that the inverse does not exist
– Yusha
Aug 2 '16 at 3:53
The linear system
$$\begin{bmatrix} 1 & -1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 4 \\ 6 \end{bmatrix}$$
has no solution. Left-multiplying both sides by $\begin{bmatrix} 1 & -1 \\ 1 & -1 \end{bmatrix}^T$, we obtain the normal equations
$$\begin{bmatrix} 2 & -2 \\ -2 & 2 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 10 \\ -10 \end{bmatrix}$$
Dividing both sides by $2$ and removing the redundant equation,
$$x - y = 5$$
Thus, there are infinitely many least-squares solutions. One of them is
$$\begin{bmatrix} \hat x \\ \hat y \end{bmatrix} = \begin{bmatrix} 6 \\ 1 \end{bmatrix}$$
The least-squares solution is a solution to the normal equations, not to the original linear system.
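The normal equations can be verified numerically; a sketch assuming NumPy, using `lstsq` since $A^TA$ is singular:

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [1.0, -1.0]])
b = np.array([4.0, 6.0])

AtA = A.T @ A   # [[ 2, -2], [-2,  2]] -- singular, so no unique solution
Atb = A.T @ b   # [ 10, -10]
# The normal equations are consistent; lstsq picks the minimum-norm solution.
x = np.linalg.lstsq(AtA, Atb, rcond=None)[0]
print(x)        # satisfies x - y = 5
```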
Your matrix is just the coefficients of your system of equations. In this case
$$ x - y = 4 $$
$$ x - y = 6 $$
leads to
$$
\begin{bmatrix} 1 & -1 \\ 1 & -1 \end{bmatrix}
\begin{bmatrix} x \\ y \end{bmatrix}
=
\begin{bmatrix} 4 \\ 6 \end{bmatrix}
$$
but you should see that there is no exact solution, since $x - y$ cannot be both $4$ and $6$.
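To see concretely why the OP's formula $(A^TA)^{-1}A^Tb$ cannot be applied here, a small NumPy sketch (the analogous MATLAB expression `inv(A'*A)*A'*b` was reported in the comments to give NaNs):

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [1.0, -1.0]])
b = np.array([4.0, 6.0])

# The textbook formula (A^T A)^{-1} A^T b breaks down, because A^T A
# is exactly singular for this rank-1 matrix.
singular = False
try:
    x = np.linalg.inv(A.T @ A) @ A.T @ b
except np.linalg.LinAlgError:
    singular = True
    print("A^T A is exactly singular; the textbook formula fails")
```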
The OP is looking for the least-squares solution, which always exists.
– Rodrigo de Azevedo
Aug 1 '16 at 23:40
@RodrigodeAzevedo What would it be in this case, and is it unique? Yes maybe you can identify a LS solution, but what would the meaning of it be here with only two points? The OP's suggestion of inv(A'*A)*A'*b results in [NaN;NaN] according to MATLAB.
– Carser
Aug 1 '16 at 23:45
@Carser it's not necessary for the system to have a solution. This is the essence of least squares. Find the $(x,y)$ that minimises the distance between the vectors $(1,1)'x +(-1,-1)'y$ and $(4,6)$
– theoGR
Aug 2 '16 at 0:33
@theoGR You are of course correct, you can find a LS fitting here, but what is it worth when there are only two data points and a singular matrix?
– Carser
Aug 2 '16 at 0:41
@Rodrigo de Azevedo: There is no (non-trivial) least squares solution when the data vector is in the null space. See, for example, math.stackexchange.com/questions/2244851/…
– dantopa
Nov 22 at 17:48
We have the linear system
$$begin{bmatrix} 1 & -1\ 1 & -1end{bmatrix} begin{bmatrix} x\ yend{bmatrix} = begin{bmatrix} 4 \ 6end{bmatrix}$$
which can be rewritten in the form
$$begin{bmatrix} 1\ 1end{bmatrix} eta = begin{bmatrix} 4 \ 6end{bmatrix}$$
where $eta = x - y$. Left-multiplying both sides by $begin{bmatrix} 1\ 1end{bmatrix}^T$, we obtain $2 eta = 10$, or, $eta = 5$. Hence,
$$x - y = 5$$
Thus, there are infinitely many least-squares solutions. One of them is
$$begin{bmatrix} hat x\ hat yend{bmatrix} = begin{bmatrix} 6\ 1end{bmatrix}$$
1
@RodrigodeAzevedo Haha I think it might be. My up-vote will bring you back up to zero ;p
– Carser
Aug 2 '16 at 11:48
Why do you have two nearly identical answers to this question?
– Morgan Rodgers
Mar 14 '17 at 3:56
@MorganRodgers Because one is better than the other.
– Rodrigo de Azevedo
Mar 14 '17 at 7:48
add a comment |
We have the linear system
$$begin{bmatrix} 1 & -1\ 1 & -1end{bmatrix} begin{bmatrix} x\ yend{bmatrix} = begin{bmatrix} 4 \ 6end{bmatrix}$$
which can be rewritten in the form
$$begin{bmatrix} 1\ 1end{bmatrix} eta = begin{bmatrix} 4 \ 6end{bmatrix}$$
where $eta = x - y$. Left-multiplying both sides by $begin{bmatrix} 1\ 1end{bmatrix}^T$, we obtain $2 eta = 10$, or, $eta = 5$. Hence,
$$x - y = 5$$
Thus, there are infinitely many least-squares solutions. One of them is
$$begin{bmatrix} hat x\ hat yend{bmatrix} = begin{bmatrix} 6\ 1end{bmatrix}$$
1
@RodrigodeAzevedo Haha I think it might be. My up-vote will bring you back up to zero ;p
– Carser
Aug 2 '16 at 11:48
Why do you have two nearly identical answers to this question?
– Morgan Rodgers
Mar 14 '17 at 3:56
@MorganRodgers Because one is better than the other.
– Rodrigo de Azevedo
Mar 14 '17 at 7:48
add a comment |
We have the linear system
$$begin{bmatrix} 1 & -1\ 1 & -1end{bmatrix} begin{bmatrix} x\ yend{bmatrix} = begin{bmatrix} 4 \ 6end{bmatrix}$$
which can be rewritten in the form
$$begin{bmatrix} 1\ 1end{bmatrix} eta = begin{bmatrix} 4 \ 6end{bmatrix}$$
where $eta = x - y$. Left-multiplying both sides by $begin{bmatrix} 1\ 1end{bmatrix}^T$, we obtain $2 eta = 10$, or, $eta = 5$. Hence,
$$x - y = 5$$
Thus, there are infinitely many least-squares solutions. One of them is
$$begin{bmatrix} hat x\ hat yend{bmatrix} = begin{bmatrix} 6\ 1end{bmatrix}$$
We have the linear system
$$begin{bmatrix} 1 & -1\ 1 & -1end{bmatrix} begin{bmatrix} x\ yend{bmatrix} = begin{bmatrix} 4 \ 6end{bmatrix}$$
which can be rewritten in the form
$$begin{bmatrix} 1\ 1end{bmatrix} eta = begin{bmatrix} 4 \ 6end{bmatrix}$$
where $eta = x - y$. Left-multiplying both sides by $begin{bmatrix} 1\ 1end{bmatrix}^T$, we obtain $2 eta = 10$, or, $eta = 5$. Hence,
$$x - y = 5$$
Thus, there are infinitely many least-squares solutions. One of them is
$$begin{bmatrix} hat x\ hat yend{bmatrix} = begin{bmatrix} 6\ 1end{bmatrix}$$
answered Aug 2 '16 at 0:59
Rodrigo de Azevedo
12.8k41855
12.8k41855
1
@RodrigodeAzevedo Haha I think it might be. My up-vote will bring you back up to zero ;p
– Carser
Aug 2 '16 at 11:48
Why do you have two nearly identical answers to this question?
– Morgan Rodgers
Mar 14 '17 at 3:56
@MorganRodgers Because one is better than the other.
– Rodrigo de Azevedo
Mar 14 '17 at 7:48
add a comment |
1
@RodrigodeAzevedo Haha I think it might be. My up-vote will bring you back up to zero ;p
– Carser
Aug 2 '16 at 11:48
Why do you have two nearly identical answers to this question?
– Morgan Rodgers
Mar 14 '17 at 3:56
@MorganRodgers Because one is better than the other.
– Rodrigo de Azevedo
Mar 14 '17 at 7:48
1
1
@RodrigodeAzevedo Haha I think it might be. My up-vote will bring you back up to zero ;p
– Carser
Aug 2 '16 at 11:48
@RodrigodeAzevedo Haha I think it might be. My up-vote will bring you back up to zero ;p
– Carser
Aug 2 '16 at 11:48
Why do you have two nearly identical answers to this question?
– Morgan Rodgers
Mar 14 '17 at 3:56
Why do you have two nearly identical answers to this question?
– Morgan Rodgers
Mar 14 '17 at 3:56
@MorganRodgers Because one is better than the other.
– Rodrigo de Azevedo
Mar 14 '17 at 7:48
@MorganRodgers Because one is better than the other.
– Rodrigo de Azevedo
Mar 14 '17 at 7:48
add a comment |
First, choose points (x,y) that satisfy each equation.
$begin{cases}
x - y = 4, & text{(6,2)} \
x - y = 6, & text{(10,4)}
end{cases}$
Then, proceed as usual
$Ax = begin{bmatrix}
1 & 6 \
1 & 10 \
end{bmatrix} begin{bmatrix}
b \
m \
end{bmatrix} = begin{bmatrix}
2 \
4 \
end{bmatrix}$
$begin{bmatrix}
b \
m \
end{bmatrix} =begin{bmatrix}
5 \
1/2 \
end{bmatrix}$
$y = 1/2x + 5$
add a comment |
First, choose points (x,y) that satisfy each equation.
$begin{cases}
x - y = 4, & text{(6,2)} \
x - y = 6, & text{(10,4)}
end{cases}$
Then, proceed as usual
$Ax = begin{bmatrix}
1 & 6 \
1 & 10 \
end{bmatrix} begin{bmatrix}
b \
m \
end{bmatrix} = begin{bmatrix}
2 \
4 \
end{bmatrix}$
$begin{bmatrix}
b \
m \
end{bmatrix} =begin{bmatrix}
5 \
1/2 \
end{bmatrix}$
$y = 1/2x + 5$
add a comment |
First, choose points (x,y) that satisfy each equation.
$begin{cases}
x - y = 4, & text{(6,2)} \
x - y = 6, & text{(10,4)}
end{cases}$
Then, proceed as usual
$Ax = begin{bmatrix}
1 & 6 \
1 & 10 \
end{bmatrix} begin{bmatrix}
b \
m \
end{bmatrix} = begin{bmatrix}
2 \
4 \
end{bmatrix}$
$begin{bmatrix}
b \
m \
end{bmatrix} =begin{bmatrix}
5 \
1/2 \
end{bmatrix}$
$y = 1/2x + 5$
First, choose points (x,y) that satisfy each equation.
$begin{cases}
x - y = 4, & text{(6,2)} \
x - y = 6, & text{(10,4)}
end{cases}$
Then, proceed as usual
$Ax = begin{bmatrix}
1 & 6 \
1 & 10 \
end{bmatrix} begin{bmatrix}
b \
m \
end{bmatrix} = begin{bmatrix}
2 \
4 \
end{bmatrix}$
$begin{bmatrix}
b \
m \
end{bmatrix} =begin{bmatrix}
5 \
1/2 \
end{bmatrix}$
$y = 1/2x + 5$
edited Aug 2 '16 at 1:40
answered Aug 2 '16 at 0:00
Tomek Dobrzynski
392
392
add a comment |
add a comment |
Problem statement: underdetermined system
Start with the linear system
$$
begin{align}
mathbf{A} x &= b \
%
left[
begin{array}{cc}
1 & -1 \
1 & -1 \
end{array}
right]
%
left[
begin{array}{c}
x \
y
end{array}
right]
%
&=
%
left[
begin{array}{c}
4 \
6
end{array}
right]
%
end{align}
$$
The system has matrix rank $rho = 1$; therefore, if a solution exists, it will not be unique.
Provided $bnotin color{red}{mathcal{N} left( mathbf{A}^{*} right)}$, we are guaranteed a least squares solution
$$
x_{LS} = left{ xinmathbb{C}^{2} colon lVert mathbf{A} x_{LS} - b rVert_{2}^{2} text{ is minimized} right}
tag{1}
$$
Subspace resolution
By inspection, we see that the row space is resolved as
$$
color{blue}{mathcal{R} left( mathbf{A}^{*} right)} oplus
color{red}{mathcal{N} left( mathbf{A} right)}
=
color{blue}{left[
begin{array}{r}
1 \
-1
end{array}
right]} oplus
color{red}{left[
begin{array}{c}
1 \
1
end{array}
right]}
$$
The column space is resolved as
$$
color{blue}{mathcal{R} left( mathbf{A} right)} oplus
color{red}{mathcal{N} left( mathbf{A}^{*} right)}
=
color{blue}{left[
begin{array}{c}
1 \
1
end{array}
right]} oplus
color{red}{left[
begin{array}{r}
-1 \
1
end{array}
right]}
$$
The coloring indicates vectors in the $color{blue}{range}$ space and the $color{red}{null}$ space.
Finding the least squares solution
Since there is only one vector in $color{blue}{mathcal{R} left( mathbf{A}^{*} right)}$, the solution vector will have the form
$$
color{blue}{x_{LS}} = alpha
color{blue}{left[
begin{array}{r}
1 \
-1
end{array}
right]}
$$
The goal is to find the constant $alpha$ to minimize (1):
$$
color{red}{r}^{2} = color{red}{r} cdot color{red}{r} =
lVert
color{blue}{mathbf{A} x_{LS}} - b
rVert_{2}^{2}
=
8 alpha ^2-40 alpha +52
$$
The minimum of the polynomial is at
$$
alpha = frac{5}{2}
$$
Least squares solution
The set of least squares minimizers in (1) is then the affine set given by
$$
x_{LS} = frac{5}{2}
color{blue}{left[
begin{array}{r}
1 \
-1
end{array}
right]}
+
xi
color{red}{left[
begin{array}{r}
1 \
1
end{array}
right]}, qquad xiinmathbb{C}
$$
The plot below shows how the total error $lVert mathbf{A} x_{LS} - b rVert_{2}^{2}$ varies with the fit parameters. The blue dot is the particular solution, the dashed line homogeneous solution as well as the $0$ contour - the exact solution.
Addendum: Existence of the Least Squares Solution
To address the insightful question of @RodrigodeAzevedo, consider the linear system:
$$
begin{align}
mathbf{A} x &= b \
%
left[
begin{array}{cc}
1 & 0 \
0 & 0 \
end{array}
right]
%
left[
begin{array}{c}
x \
y
end{array}
right]
%
&=
%
left[
begin{array}{c}
0 \
1
end{array}
right]
%
end{align}
$$
The data vector $b$ is entirely in the null space of $mathbf{A}^{*}$:
$bin color{red}{mathcal{N} left( mathbf{A}^{*} right)}$
As pointed out, the system matrix has the singular value decomposition. One instance is:
$$mathbf{A} = mathbf{U}, Sigma, mathbf{V}^{*} = mathbf{I}_{2}
left[
begin{array}{cc}
1 & 0 \
0 & 0 \
end{array}
right]
mathbf{I}_{2}$$
and the concomitant pseudoinverse,
$$mathbf{A}^{dagger} = mathbf{V}, Sigma^{dagger} mathbf{U}^{*} =
mathbf{I}_{2}
left[
begin{array}{cc}
1 & 0 \
0 & 0 \
end{array}
right]
mathbf{I}_{2} = mathbf{A}$$
Following least squares canon, the particular solution to the least squares problem is computed as
$$
color{blue}{x_{LS}} = mathbf{A}^{dagger} b =
color{red}{left[
begin{array}{c}
0 \
0 \
end{array}
right]}
qquad RightarrowLeftarrow
$$
The color collision (null space [red] = range space [blue]) indicates a problem. There is no component of a particular solution vector in a range space!
Mathematicians habitually exclude the $0$ vector a solution to linear problems.
If the RHS of the normal equations is zero, there will be infinitely many solutions when $bf A$ does not have full column rank. Lastly, I do not understand your hostility towards the zero solution.
– Rodrigo de Azevedo
Dec 1 at 9:19
@Rodrigo de Azevedo: Here is a picture in 3D: math.stackexchange.com/questions/2253443/…. The least squares solution $$ x_{LS} = color{blue}{mathbf{A}^{+} b} + color{red}{ left( mathbf{I}_{n} - mathbf{A}^{+} mathbf{A} right) y}, quad y in mathbb{C}^{n} tag{1} $$ The topology of the particular solution is a point (finite), and the homogeneous solution is a hyperplane. The pseudoinverse solution is where the homogenous solution intersects the range space. There is no intersection here.
– dantopa
Dec 1 at 17:33
Why even bring the pseudoinverse to the discussion?
– Rodrigo de Azevedo
Dec 1 at 17:36
The pseudoinverse provides the most general form of the particular solution (the range space component).
– dantopa
Dec 1 at 18:27
add a comment |
Problem statement: underdetermined system
Start with the linear system
$$
begin{align}
mathbf{A} x &= b \
%
left[
begin{array}{cc}
1 & -1 \
1 & -1 \
end{array}
right]
%
left[
begin{array}{c}
x \
y
end{array}
right]
%
&=
%
left[
begin{array}{c}
4 \
6
end{array}
right]
%
end{align}
$$
The system has matrix rank $rho = 1$; therefore, if a solution exists, it will not be unique.
Provided $bnotin color{red}{mathcal{N} left( mathbf{A}^{*} right)}$, we are guaranteed a least squares solution
$$
x_{LS} = left{ xinmathbb{C}^{2} colon lVert mathbf{A} x_{LS} - b rVert_{2}^{2} text{ is minimized} right}
tag{1}
$$
Subspace resolution
By inspection, we see that the row space is resolved as
$$
color{blue}{mathcal{R} left( mathbf{A}^{*} right)} oplus
color{red}{mathcal{N} left( mathbf{A} right)}
=
color{blue}{left[
begin{array}{r}
1 \
-1
end{array}
right]} oplus
color{red}{left[
begin{array}{c}
1 \
1
end{array}
right]}
$$
The column space is resolved as
$$
color{blue}{mathcal{R} left( mathbf{A} right)} oplus
color{red}{mathcal{N} left( mathbf{A}^{*} right)}
=
color{blue}{left[
begin{array}{c}
1 \
1
end{array}
right]} oplus
color{red}{left[
begin{array}{r}
-1 \
1
end{array}
right]}
$$
The coloring indicates vectors in the $color{blue}{range}$ space and the $color{red}{null}$ space.
Finding the least squares solution
Since there is only one vector in $color{blue}{mathcal{R} left( mathbf{A}^{*} right)}$, the solution vector will have the form
$$
color{blue}{x_{LS}} = alpha
color{blue}{left[
begin{array}{r}
1 \
-1
end{array}
right]}
$$
The goal is to find the constant $alpha$ to minimize (1):
$$
color{red}{r}^{2} = color{red}{r} cdot color{red}{r} =
lVert
color{blue}{mathbf{A} x_{LS}} - b
rVert_{2}^{2}
=
8 alpha ^2-40 alpha +52
$$
The minimum of the polynomial is at
$$
alpha = frac{5}{2}
$$
Least squares solution
The set of least squares minimizers in (1) is then the affine set given by
$$
x_{LS} = frac{5}{2}
color{blue}{left[
begin{array}{r}
1 \
-1
end{array}
right]}
+
xi
color{red}{left[
begin{array}{r}
1 \
1
end{array}
right]}, qquad xiinmathbb{C}
$$
The plot below shows how the total error $lVert mathbf{A} x_{LS} - b rVert_{2}^{2}$ varies with the fit parameters. The blue dot is the particular solution, the dashed line homogeneous solution as well as the $0$ contour - the exact solution.
Addendum: Existence of the Least Squares Solution
To address the insightful question of @RodrigodeAzevedo, consider the linear system:
$$
begin{align}
mathbf{A} x &= b \
%
left[
begin{array}{cc}
1 & 0 \
0 & 0 \
end{array}
right]
%
left[
begin{array}{c}
x \
y
end{array}
right]
%
&=
%
left[
begin{array}{c}
0 \
1
end{array}
right]
%
end{align}
$$
The data vector $b$ is entirely in the null space of $mathbf{A}^{*}$:
$bin color{red}{mathcal{N} left( mathbf{A}^{*} right)}$
As pointed out, the system matrix has the singular value decomposition. One instance is:
$$mathbf{A} = mathbf{U}, Sigma, mathbf{V}^{*} = mathbf{I}_{2}
left[
begin{array}{cc}
1 & 0 \
0 & 0 \
end{array}
right]
mathbf{I}_{2}$$
and the concomitant pseudoinverse,
$$mathbf{A}^{dagger} = mathbf{V}, Sigma^{dagger} mathbf{U}^{*} =
mathbf{I}_{2}
left[
begin{array}{cc}
1 & 0 \
0 & 0 \
end{array}
right]
mathbf{I}_{2} = mathbf{A}$$
Following least squares canon, the particular solution to the least squares problem is computed as
$$
color{blue}{x_{LS}} = mathbf{A}^{dagger} b =
color{red}{left[
begin{array}{c}
0 \
0 \
end{array}
right]}
qquad RightarrowLeftarrow
$$
The color collision (null space [red] = range space [blue]) indicates a problem. There is no component of a particular solution vector in a range space!
Mathematicians habitually exclude the $0$ vector a solution to linear problems.
If the RHS of the normal equations is zero, there will be infinitely many solutions when $bf A$ does not have full column rank. Lastly, I do not understand your hostility towards the zero solution.
– Rodrigo de Azevedo
Dec 1 at 9:19
@Rodrigo de Azevedo: Here is a picture in 3D: math.stackexchange.com/questions/2253443/…. The least squares solution $$ x_{LS} = color{blue}{mathbf{A}^{+} b} + color{red}{ left( mathbf{I}_{n} - mathbf{A}^{+} mathbf{A} right) y}, quad y in mathbb{C}^{n} tag{1} $$ The topology of the particular solution is a point (finite), and the homogeneous solution is a hyperplane. The pseudoinverse solution is where the homogenous solution intersects the range space. There is no intersection here.
– dantopa
Dec 1 at 17:33
Why even bring the pseudoinverse to the discussion?
– Rodrigo de Azevedo
Dec 1 at 17:36
The pseudoinverse provides the most general form of the particular solution (the range space component).
– dantopa
Dec 1 at 18:27
add a comment |
Problem statement: underdetermined system
Start with the linear system
$$
begin{align}
mathbf{A} x &= b \
%
left[
begin{array}{cc}
1 & -1 \
1 & -1 \
end{array}
right]
%
left[
begin{array}{c}
x \
y
end{array}
right]
%
&=
%
left[
begin{array}{c}
4 \
6
end{array}
right]
%
end{align}
$$
The system has matrix rank $rho = 1$; therefore, if a solution exists, it will not be unique.
Provided $bnotin color{red}{mathcal{N} left( mathbf{A}^{*} right)}$, we are guaranteed a least squares solution
$$
x_{LS} = left{ xinmathbb{C}^{2} colon lVert mathbf{A} x_{LS} - b rVert_{2}^{2} text{ is minimized} right}
tag{1}
$$
Subspace resolution
By inspection, we see that the row space is resolved as
$$
color{blue}{mathcal{R} left( mathbf{A}^{*} right)} oplus
color{red}{mathcal{N} left( mathbf{A} right)}
=
color{blue}{left[
begin{array}{r}
1 \
-1
end{array}
right]} oplus
color{red}{left[
begin{array}{c}
1 \
1
end{array}
right]}
$$
The column space is resolved as
$$
color{blue}{mathcal{R} left( mathbf{A} right)} oplus
color{red}{mathcal{N} left( mathbf{A}^{*} right)}
=
color{blue}{left[
begin{array}{c}
1 \
1
end{array}
right]} oplus
color{red}{left[
begin{array}{r}
-1 \
1
end{array}
right]}
$$
The coloring indicates vectors in the $color{blue}{range}$ space and the $color{red}{null}$ space.
Finding the least squares solution
Problem statement: underdetermined system
Start with the linear system
$$
\begin{align}
\mathbf{A} x &= b \\
\left[
\begin{array}{cc}
1 & -1 \\
1 & -1
\end{array}
\right]
\left[
\begin{array}{c}
x \\
y
\end{array}
\right]
&=
\left[
\begin{array}{c}
4 \\
6
\end{array}
\right]
\end{align}
$$
The system has matrix rank $\rho = 1$; therefore, if a solution exists, it will not be unique.
Provided $b \notin \color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)}$, we are guaranteed a least squares solution
$$
x_{LS} = \left\{ x \in \mathbb{C}^{2} \colon \lVert \mathbf{A} x - b \rVert_{2}^{2} \text{ is minimized} \right\}
\tag{1}
$$
Subspace resolution
By inspection, we see that the row space is resolved as
$$
\color{blue}{\mathcal{R} \left( \mathbf{A}^{*} \right)} \oplus
\color{red}{\mathcal{N} \left( \mathbf{A} \right)}
=
\color{blue}{\left[
\begin{array}{r}
1 \\
-1
\end{array}
\right]} \oplus
\color{red}{\left[
\begin{array}{c}
1 \\
1
\end{array}
\right]}
$$
The column space is resolved as
$$
\color{blue}{\mathcal{R} \left( \mathbf{A} \right)} \oplus
\color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)}
=
\color{blue}{\left[
\begin{array}{c}
1 \\
1
\end{array}
\right]} \oplus
\color{red}{\left[
\begin{array}{r}
-1 \\
1
\end{array}
\right]}
$$
The coloring indicates vectors in the $\color{blue}{range}$ space and the $\color{red}{null}$ space.
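The subspace resolution above is easy to check numerically. A minimal sketch, assuming NumPy (the thread itself only mentions MATLAB), verifying that the claimed basis vectors really span the null space of $\mathbf{A}$ and of $\mathbf{A}^{*}$:

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [1.0, -1.0]])

# Domain side: row space R(A*) and null space N(A)
row_vec = np.array([1.0, -1.0])    # spans R(A*)
null_vec = np.array([1.0, 1.0])    # spans N(A): A @ null_vec = 0

# Codomain side: column space R(A) and left null space N(A*)
col_vec = np.array([1.0, 1.0])     # spans R(A)
left_null = np.array([-1.0, 1.0])  # spans N(A*): A.T @ left_null = 0

print(A @ null_vec)     # [0. 0.]
print(A.T @ left_null)  # [0. 0.]
```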
Finding the least squares solution
Since there is only one vector in $\color{blue}{\mathcal{R} \left( \mathbf{A}^{*} \right)}$, the solution vector will have the form
$$
\color{blue}{x_{LS}} = \alpha
\color{blue}{\left[
\begin{array}{r}
1 \\
-1
\end{array}
\right]}
$$
The goal is to find the constant $\alpha$ to minimize (1):
$$
\color{red}{r}^{2} = \color{red}{r} \cdot \color{red}{r} =
\lVert
\color{blue}{\mathbf{A} x_{LS}} - b
\rVert_{2}^{2}
=
8 \alpha^{2} - 40 \alpha + 52
$$
Setting the derivative $16 \alpha - 40$ to zero, the minimum of the polynomial is at
$$
\alpha = \frac{5}{2}
$$
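As a numerical sanity check (my own NumPy sketch, not part of the original answer), the squared residual along the row-space direction does reproduce the polynomial $8\alpha^{2} - 40\alpha + 52$:

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [1.0, -1.0]])
b = np.array([4.0, 6.0])
v = np.array([1.0, -1.0])   # basis vector of the row space R(A*)

def r2(alpha):
    """Squared residual ||A (alpha v) - b||^2 along the row-space direction."""
    return np.linalg.norm(A @ (alpha * v) - b) ** 2

print(r2(0.0))   # 52.0, the constant term of the polynomial
print(r2(2.5))   # 2.0, the minimum value, attained at alpha = 5/2
```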
Least squares solution
The set of least squares minimizers in (1) is then the affine set given by
$$
x_{LS} = \frac{5}{2}
\color{blue}{\left[
\begin{array}{r}
1 \\
-1
\end{array}
\right]}
+
\xi
\color{red}{\left[
\begin{array}{r}
1 \\
1
\end{array}
\right]}, \qquad \xi \in \mathbb{C}
$$
The plot below shows how the total error $\lVert \mathbf{A} x_{LS} - b \rVert_{2}^{2}$ varies with the fit parameters. The blue dot is the particular solution; the dashed line is the homogeneous solution as well as the $0$ contour, the exact solution.
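In floating point, the pseudoinverse reproduces the particular (row-space) component of this affine set. A hedged NumPy sketch (NumPy is my assumption; the thread itself only references MATLAB):

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [1.0, -1.0]])
b = np.array([4.0, 6.0])

# The pseudoinverse returns the minimum-norm least squares solution:
# the particular component (5/2)(1, -1) of the affine set above.
x_ls = np.linalg.pinv(A) @ b
print(x_ls)   # approximately [ 2.5 -2.5]

# Shifting along the null-space direction (1, 1) leaves the residual unchanged.
r0 = np.linalg.norm(A @ x_ls - b)
r1 = np.linalg.norm(A @ (x_ls + 3.0 * np.array([1.0, 1.0])) - b)
print(r0, r1)  # both sqrt(2)
```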
Addendum: Existence of the Least Squares Solution
To address the insightful question of @RodrigodeAzevedo, consider the linear system:
$$
\begin{align}
\mathbf{A} x &= b \\
\left[
\begin{array}{cc}
1 & 0 \\
0 & 0
\end{array}
\right]
\left[
\begin{array}{c}
x \\
y
\end{array}
\right]
&=
\left[
\begin{array}{c}
0 \\
1
\end{array}
\right]
\end{align}
$$
The data vector $b$ is entirely in the null space of $\mathbf{A}^{*}$:
$b \in \color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)}$
As pointed out, the system matrix has a singular value decomposition; one instance is
$$\mathbf{A} = \mathbf{U} \, \Sigma \, \mathbf{V}^{*} = \mathbf{I}_{2}
\left[
\begin{array}{cc}
1 & 0 \\
0 & 0
\end{array}
\right]
\mathbf{I}_{2}$$
and the concomitant pseudoinverse,
$$\mathbf{A}^{\dagger} = \mathbf{V} \, \Sigma^{\dagger} \mathbf{U}^{*} =
\mathbf{I}_{2}
\left[
\begin{array}{cc}
1 & 0 \\
0 & 0
\end{array}
\right]
\mathbf{I}_{2} = \mathbf{A}$$
Following least squares canon, the particular solution to the least squares problem is computed as
$$
\color{blue}{x_{LS}} = \mathbf{A}^{\dagger} b =
\color{red}{\left[
\begin{array}{c}
0 \\
0
\end{array}
\right]}
\qquad \Rightarrow\Leftarrow
$$
The color collision (null space [red] = range space [blue]) indicates a problem: there is no component of the particular solution vector in the range space!
Mathematicians habitually exclude the $0$ vector as a solution to linear problems.
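The same collapse is visible numerically. A small NumPy sketch of the addendum's example (my own translation of the math above, not part of the original answer):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 0.0]])
b = np.array([0.0, 1.0])   # b lies entirely in N(A*)

x_ls = np.linalg.pinv(A) @ b             # collapses to the zero vector
residual = np.linalg.norm(A @ x_ls - b)  # irreducible residual ||b||

print(x_ls)      # [0. 0.]
print(residual)  # 1.0: no vector in the range of A gets closer to b
```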
edited Dec 1 at 0:31
answered Mar 14 '17 at 3:52
– dantopa
If the RHS of the normal equations is zero, there will be infinitely many solutions when ${\bf A}$ does not have full column rank. Lastly, I do not understand your hostility towards the zero solution.
– Rodrigo de Azevedo
Dec 1 at 9:19
@Rodrigo de Azevedo: Here is a picture in 3D: math.stackexchange.com/questions/2253443/…. The least squares solution $$ x_{LS} = \color{blue}{\mathbf{A}^{+} b} + \color{red}{ \left( \mathbf{I}_{n} - \mathbf{A}^{+} \mathbf{A} \right) y}, \quad y \in \mathbb{C}^{n} \tag{1} $$ The topology of the particular solution is a point (finite), and the homogeneous solution is a hyperplane. The pseudoinverse solution is where the homogeneous solution intersects the range space. There is no intersection here.
– dantopa
Dec 1 at 17:33
Why even bring the pseudoinverse to the discussion?
– Rodrigo de Azevedo
Dec 1 at 17:36
The pseudoinverse provides the most general form of the particular solution (the range space component).
– dantopa
Dec 1 at 18:27
$$A = \left[ \begin{array}{cc}
1 & -1 \\
1 & -1
\end{array} \right],
\quad b = \begin{bmatrix} 4 \\ 6 \end{bmatrix}$$
The thing is, though, that the inverse does not exist
– Yusha
Aug 2 '16 at 3:53
add a comment |
$$A = left [begin{array}{cc}
1 & -1 \
1 & -1 \
end{array} right],
quad b = begin{bmatrix}4\6 end{bmatrix} $$
The thing is tho that the inverse does not exist
– Yusha
Aug 2 '16 at 3:53
add a comment |
$$A = left [begin{array}{cc}
1 & -1 \
1 & -1 \
end{array} right],
quad b = begin{bmatrix}4\6 end{bmatrix} $$
$$A = left [begin{array}{cc}
1 & -1 \
1 & -1 \
end{array} right],
quad b = begin{bmatrix}4\6 end{bmatrix} $$
answered Aug 1 '16 at 23:12
krenzland
The linear system
$$\begin{bmatrix} 1 & -1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 4 \\ 6 \end{bmatrix}$$
has no solution. Left-multiplying both sides by $\begin{bmatrix} 1 & -1 \\ 1 & -1 \end{bmatrix}^T$, we obtain the normal equations
$$\begin{bmatrix} 2 & -2 \\ -2 & 2 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 10 \\ -10 \end{bmatrix}$$
Dividing both sides by $2$ and removing the redundant equation,
$$x - y = 5$$
Thus, there are infinitely many least-squares solutions. One of them is
$$\begin{bmatrix} \hat x \\ \hat y \end{bmatrix} = \begin{bmatrix} 6 \\ 1 \end{bmatrix}$$
The least-squares solution is a solution to the normal equations, not to the original linear system.
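This derivation is easy to verify numerically. A minimal NumPy sketch (my own illustration under the assumption of NumPy; the thread only mentions MATLAB):

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [1.0, -1.0]])
b = np.array([4.0, 6.0])

# Normal equations from the answer: (A^T A) x = A^T b
lhs = A.T @ A   # [[ 2. -2.] [-2.  2.]]
rhs = A.T @ b   # [ 10. -10.]

# Any point on the line x - y = 5 satisfies them, e.g. the answer's (6, 1):
x_hat = np.array([6.0, 1.0])
print(np.allclose(lhs @ x_hat, rhs))   # True
```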
answered Aug 1 '16 at 23:58
Rodrigo de Azevedo
Your matrix is just the coefficients of your system of equations. In this case
$$ x - y = 4 $$
$$ x - y = 6 $$
leads to
$$
\begin{bmatrix}
1 & -1 \\ 1 & -1
\end{bmatrix}
\begin{bmatrix}
x \\ y
\end{bmatrix}
=
\begin{bmatrix}
4 \\ 6
\end{bmatrix}
$$
but you should see that there is no solution to this since you can't have $x-y$ be both $4$ and $6$...
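This is also why the formula $(A^TA)^{-1}A^Tb$ from the question fails here: $A^TA$ is singular. A least squares solution still exists, though. A NumPy sketch (assuming NumPy in place of the MATLAB mentioned in the comments):

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [1.0, -1.0]])
b = np.array([4.0, 6.0])

# A^T A has rank 1, so the textbook formula (A^T A)^{-1} A^T b breaks down:
print(np.linalg.matrix_rank(A.T @ A))   # 1, not 2

# np.linalg.lstsq falls back on the SVD and still returns a
# (minimum-norm) least squares solution:
x_ls, res, rank, sing_vals = np.linalg.lstsq(A, b, rcond=None)
print(x_ls)   # approximately [ 2.5 -2.5]
```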
1
The OP is looking for the least-squares solution, which always exists.
– Rodrigo de Azevedo
Aug 1 '16 at 23:40
@RodrigodeAzevedo What would it be in this case, and is it unique? Yes, maybe you can identify a LS solution, but what would the meaning of it be here with only two points? The OP's suggestion of inv(A'*A)*A'*b results in [NaN;NaN] according to MATLAB.
– Carser
Aug 1 '16 at 23:45
1
@Carser it's not necessary for the system to have a solution. This is the essence of least squares. Find the $(x,y)$ that minimises the distance between the vectors $(1,1)'x + (-1,-1)'y$ and $(4,6)$
– theoGR
Aug 2 '16 at 0:33
@theoGR You are of course correct, you can find a LS fitting here, but what is it worth when there are only two data points and a singular matrix?
– Carser
Aug 2 '16 at 0:41
1
@Rodrigo de Azevedo: There is no (non-trivial) least squares solution when the data vector is in the null space. See, for example, math.stackexchange.com/questions/2244851/…
– dantopa
Nov 22 at 17:48
answered Aug 1 '16 at 23:13
Carser