Sum of a Symmetric Positive Definite Matrix and a Scalar Multiple of the Identity
If $A$ is an $n\times n$ symmetric positive definite matrix with smallest eigenvalue $\lambda$, then for any $\mu>-\lambda$, $A+\mu I$ is positive definite.

I am trying to show this, but I am stuck on one part. Here is what I have so far:
$$
\begin{align*}
\langle x,\left(A+\mu I\right)x\rangle &= \langle x,Ax+\mu x\rangle\\
&= \langle x,Ax\rangle+\langle x,\mu x\rangle\\
&> 0+\mu\langle x,x\rangle\\
&> -\lambda\langle x,x\rangle.
\end{align*}
$$
I'm stuck on showing that $\mu\langle x,x\rangle$ is positive because I only know that $\mu>-\lambda$. Any help would be appreciated.
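As a quick sanity check (not a proof), a small numerical experiment agrees with the statement, assuming NumPy is available; the random matrix and the particular shift $\mu$ below are only illustrative choices. Since $A+\mu I$ has the same eigenvectors as $A$, its eigenvalues are exactly $\lambda_i+\mu$, which are all positive whenever $\mu>-\lambda$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random symmetric positive definite matrix: M^T M is PSD, adding I makes it PD.
n = 5
M = rng.standard_normal((n, n))
A = M.T @ M + np.eye(n)

lam = np.linalg.eigvalsh(A).min()        # smallest eigenvalue of A

mu = -lam + 1e-3                         # any mu > -lam; this one is barely above
shifted = np.linalg.eigvalsh(A + mu * np.eye(n))

# A + mu*I has the same eigenvectors as A, with eigenvalues lam_i + mu.
print(shifted.min(), lam + mu)           # both approximately 1e-3
assert shifted.min() > 0                 # A + mu*I is positive definite
```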
linear-algebra matrices eigenvalues-eigenvectors positive-definite
asked Dec 10 '18 at 18:18 by Jake

Comment (DisintegratingByParts, Dec 10 '18 at 18:22): $\langle Ax,x\rangle \ge \lambda\langle x,x\rangle$ by the assumptions on $A$.
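In more detail, here is how the hint closes the gap in the computation above (a minimal sketch; $x\ne 0$ is assumed, so that $\langle x,x\rangle>0$):
$$
\begin{align*}
\langle x,(A+\mu I)x\rangle &= \langle x,Ax\rangle+\mu\langle x,x\rangle\\
&\ge \lambda\langle x,x\rangle+\mu\langle x,x\rangle \qquad \text{(using } \langle x,Ax\rangle\ge\lambda\langle x,x\rangle\text{)}\\
&= (\lambda+\mu)\langle x,x\rangle > 0 \qquad \text{(since } \lambda+\mu>0 \text{ and } x\ne 0\text{)}.
\end{align*}
$$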
1 Answer
If $A$ is a positive definite symmetric matrix with smallest eigenvalue $\lambda$, then for all vectors $x$ we have

$\langle x, Ax \rangle \ge \lambda \langle x, x \rangle; \tag 1$

the easiest way I know to see this is to diagonalize $A$ by a suitable orthogonal matrix $O$, which will preserve the inner product:

$\langle Oy, Ox \rangle = \langle y, O^T O x \rangle = \langle y, Ix \rangle = \langle y, x \rangle, \tag 2$

where we have used the fact that

$O^T O = O O^T = I \tag 3$

in (2); then we have

$OAO^T = \Lambda = \text{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n), \tag 4$

where $\lambda_1, \lambda_2, \ldots, \lambda_n$ are the eigenvalues of $A$. It is well known that $A$ also has an orthonormal eigenbasis of vectors $e_i$ such that

$A e_i = \lambda_i e_i, \; 1 \le i \le n; \tag 5$

we may then write

$x = \displaystyle \sum_1^n x_i e_i, \tag 6$

and

$\langle x, Ax \rangle = \left\langle \displaystyle \sum_1^n x_i e_i, \sum_1^n x_i A e_i \right\rangle = \left\langle \displaystyle \sum_1^n x_i e_i, \sum_1^n x_i \lambda_i e_i \right\rangle = \displaystyle \sum_{i, j = 1}^n x_i x_j \langle e_i, \lambda_j e_j \rangle$

$= \displaystyle \sum_{i, j = 1}^n x_i x_j \lambda_j \langle e_i, e_j \rangle = \sum_{i, j = 1}^n x_i x_j \lambda_j \delta_{ij} = \sum_1^n \lambda_i x_i^2; \tag 7$

now if

$\lambda = \min \{ \lambda_1, \lambda_2, \ldots, \lambda_n \} > 0 \tag 8$

is the least eigenvalue, then (7) yields

$\langle x, Ax \rangle = \displaystyle \sum_1^n \lambda_i x_i^2 \ge \sum_1^n \lambda x_i^2 = \lambda \sum_1^n x_i^2 = \lambda \langle x, x \rangle; \tag 9$

therefore,

$\langle x, (A + \mu I)x \rangle = \langle x, Ax \rangle + \mu \langle x, x \rangle \ge \lambda \langle x, x \rangle + \mu \langle x, x \rangle = (\lambda + \mu) \langle x, x \rangle; \tag{10}$

since

$\mu > -\lambda \Longleftrightarrow \mu + \lambda > 0, \tag{11}$

(10) becomes

$\langle x, (A + \mu I)x \rangle \ge (\mu + \lambda) \langle x, x \rangle > 0 \quad \text{for } x \ne 0, \tag{12}$

which shows that $A + \mu I$ is positive definite, the desired result. $OE\Delta$.

answered Dec 10 '18 at 19:16, edited Dec 10 '18 at 19:37, by Robert Lewis
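The inequalities in the answer can also be checked numerically; the following is a minimal sketch, assuming NumPy is available (the random matrix, sample vectors, and tolerance are illustrative only, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(1)

# Random symmetric positive definite A, its smallest eigenvalue lam,
# and a shift mu chosen just above -lam.
n = 6
M = rng.standard_normal((n, n))
A = M.T @ M + np.eye(n)
lam = np.linalg.eigvalsh(A).min()
mu = -lam + 0.01

# Check <x, Ax> >= lam <x, x>                      (inequality (1))
# and   <x, (A + mu I) x> >= (lam + mu) <x, x> > 0 (inequalities (10) and (12))
# on a batch of random vectors x.
B = A + mu * np.eye(n)
for _ in range(1000):
    x = rng.standard_normal(n)
    xx = x @ x
    assert x @ (A @ x) >= lam * xx - 1e-9 * xx
    assert x @ (B @ x) >= (lam + mu) * xx - 1e-9 * xx
    assert x @ (B @ x) > 0
print("all sampled vectors satisfy the bounds")
```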