Decomposition of the same matrix with different unitary matrices
Let the matrix $M$ be Hermitian, with
$$M = U_1 \Lambda U_1^* \qquad\qquad (1)$$
$$M = U_2 \Lambda U_2^* \qquad\qquad (2)$$
where $U_1$ and $U_2$ are unitary matrices.
How can we prove that we can have $U_1 \ne U_2$ with exactly the same diagonal $\Lambda$?
Example
MATLAB code:
M = [4 3-1i; 3+1i 10];   % Hermitian (1i is safer than i, which may be shadowed)
[U1, D] = eig(M);        % i.e. M = U1*D*U1' (D avoids shadowing the built-in diag)
U2 = U1;
U2(:,1) = -U2(:,1);      % flip the sign of the first eigenvector
U1*D*U1'                 % gives M
U2*D*U2'                 % also gives M
% U1 is not equal to U2:
% U1 =
%    0.8716 - 0.2905i   0.3746 - 0.1249i
%   -0.3948 + 0.0000i   0.9188 + 0.0000i
% U2 =
%   -0.8716 + 0.2905i   0.3746 - 0.1249i
%    0.3948 + 0.0000i   0.9188 + 0.0000i
% Both U1*D*U1' and U2*D*U2' return M:
% ans =
%    4.0000 - 0.0000i   3.0000 - 1.0000i
%    3.0000 + 1.0000i  10.0000 + 0.0000i
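The same cancellation can be checked without MATLAB. Below is a pure-Python sketch; the matrix, eigenvalues, and orthonormal eigenvectors are hand-picked for illustration (a small real symmetric example rather than the complex one above). Flipping the sign of one eigenvector column leaves $U \Lambda U^*$ unchanged, because column $k$ enters only through the rank-one term $\lambda_k \mathbf{u}_k \mathbf{u}_k^*$, and $(-\mathbf{u}_k)(-\mathbf{u}_k)^* = \mathbf{u}_k \mathbf{u}_k^*$.

```python
import math

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def conj_T(A):
    """Conjugate transpose of a 2x2 matrix."""
    return [[complex(A[j][i]).conjugate() for j in range(2)] for i in range(2)]

# M = [[2, 1], [1, 2]] has eigenvalues 3 and 1 with orthonormal
# eigenvectors (1, 1)/sqrt(2) and (1, -1)/sqrt(2).
s = 1 / math.sqrt(2)
U1 = [[s, s], [s, -s]]
Lam = [[3, 0], [0, 1]]

# U2: same eigenvectors, but with the sign of the first column flipped.
U2 = [[-s, s], [-s, -s]]

M1 = matmul(matmul(U1, Lam), conj_T(U1))
M2 = matmul(matmul(U2, Lam), conj_T(U2))

print(M1)  # entries of [[2, 1], [1, 2]], up to rounding
print(M2)  # the same matrix, despite U1 != U2
```

The rank-one view makes the general statement obvious: multiplying any column of $U$ by any unit-modulus scalar leaves every term $\lambda_k \mathbf{u}_k \mathbf{u}_k^*$, and hence the sum, unchanged.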
Tags: eigenvalues-eigenvectors, matrix-decomposition
Hint: an eigenvector is still an eigenvector if you multiply it by $-1$. – N74, Nov 29 at 21:43
That's right. I thought of that, but I wondered whether that explanation alone is sufficient. Well, it should be. Thanks. – Kay, Nov 29 at 23:13
asked Nov 29 at 21:37 by Kay
1 Answer
The eigenvectors of a matrix are determined only up to a phase $e^{i\phi}$. Indeed, if $\mathbf{u}$ is an eigenvector of $A$ with eigenvalue $\lambda$, then
\begin{eqnarray}
A \mathbf{u} &=& \lambda \mathbf{u} \\
\Rightarrow\quad A(e^{i\phi}\mathbf{u}) &=& \lambda (e^{i\phi}\mathbf{u}),
\end{eqnarray}
which means that $e^{i\phi}\mathbf{u}$ is also an eigenvector of $A$ with eigenvalue $\lambda$. Note also that $\mathbf{u}$ and $e^{i\phi}\mathbf{u}$ have the same norm:
$$
\|e^{i\phi}\mathbf{u}\| = |e^{i\phi}|\,\|\mathbf{u}\| = \|\mathbf{u}\|,
$$
since $|e^{i\phi}| = 1$. So if the eigenvectors are normalized (which is the case here), they remain normalized after multiplication by an arbitrary phase. Your sign flip is the special case $\phi = \pi$, since $e^{i\pi} = -1$.
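This phase argument can be verified numerically. Here is a pure-Python sketch using the question's matrix; the eigenpair is worked out by hand (the eigenvalues of $M$ are $7 \pm \sqrt{19}$), and the angle $\phi = 0.7$ is an arbitrary choice — any angle works:

```python
import cmath
import math

# The question's Hermitian matrix.
A = [[4, 3 - 1j], [3 + 1j, 10]]

def matvec(A, v):
    """Multiply a 2x2 matrix by a length-2 vector."""
    return [sum(A[i][k] * v[k] for k in range(2)) for i in range(2)]

def norm(v):
    """Euclidean norm of a complex vector."""
    return math.sqrt(sum(abs(x) ** 2 for x in v))

# Hand-computed eigenpair: for lam = 7 + sqrt(19), an (unnormalized)
# eigenvector is (3 - i, lam - 4).
lam = 7 + math.sqrt(19)
u = [3 - 1j, lam - 4]

# Check that u really is an eigenvector: A u == lam u.
Au = matvec(A, u)
assert all(abs(Au[i] - lam * u[i]) < 1e-9 for i in range(2))

# Multiply u by an arbitrary phase e^{i*phi}: still an eigenvector,
# with the same eigenvalue and the same norm.
phi = 0.7
w = [cmath.exp(1j * phi) * x for x in u]
Aw = matvec(A, w)
assert all(abs(Aw[i] - lam * w[i]) < 1e-9 for i in range(2))
assert abs(norm(w) - norm(u)) < 1e-12

print("phase-rotated eigenvector passes both checks")
```

Normalizing `u` and repeating the check shows the unit norm is preserved too, which is exactly why both `U1` and `U2` in the question remain unitary.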
That explains it. I appreciate it. – Kay, Nov 29 at 23:53
answered Nov 29 at 23:25 by caverac