Put a matrix $A$ in Jordan Normal Form and find a $P$ such that $P^{-1}AP=J$
I have a linear algebra exam tomorrow and this is a frequent question.
$A=
\begin{pmatrix} 4 & 0 & 1 & 0 \\ 2 & 2 & 3 & 0 \\ -1 & 0 & 2 & 0 \\ 4 & 0 & 1 & 2
\end{pmatrix}$
$C_T(x)=(x-2)^2(x-3)^2$, so the eigenvalues are $2$ and $3$.
$a_2=2$, so the sizes of the Jordan blocks for eigenvalue $2$ sum to $2$; here that turns out to be $1+1$.
$a_3=2$, so the sizes of the Jordan blocks for eigenvalue $3$ also sum to $2$.
$g_2=\dim E_2=\dim \ker (A-2I)$
$A-2I = \begin{pmatrix} 2 & 0 & 1 & 0 \\ 2 & 0 & 3 & 0 \\ -1 & 0 & 0 & 0 \\ 4 & 0 & 1 & 0 \end{pmatrix}$
$\ker(A-2I) = \left\{ \begin{pmatrix} 0 \\ y \\ 0 \\ t \end{pmatrix} \;\middle|\; y,t \in \mathbb{R} \right\} = \operatorname{Span} \left\{ \begin{pmatrix} 0 \\ 1 \\ 0 \\ 0 \end{pmatrix},\begin{pmatrix} 0\\0\\0\\1 \end{pmatrix} \right\}$
Therefore $g_2=2$, so there are $2$ Jordan blocks for eigenvalue $2$, and they must both be of size $1$.
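(As a quick sanity check of the computations so far, here is a small SymPy sketch; the choice of SymPy is an assumption, and any CAS or hand computation works just as well.)

```python
from sympy import Matrix, eye, factor, symbols

x = symbols('x')
A = Matrix([[4, 0, 1, 0],
            [2, 2, 3, 0],
            [-1, 0, 2, 0],
            [4, 0, 1, 2]])

print(factor(A.charpoly(x).as_expr()))  # factors as (x - 2)**2 * (x - 3)**2
print((A - 2*eye(4)).nullspace())       # two basis vectors, so g_2 = 2
```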
I put this in Jordan Normal form and got:
$J = \begin{pmatrix} 3 & 1 & 0 & 0 \\ 0 & 3 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 2
\end{pmatrix}$
How do I find an invertible matrix $P$ such that $P^{-1}AP=J$?
I found the Jordan Normal Form using geometric and algebraic multiplicities along with the minimum polynomial, if that helps at all!
Thanks!
linear-algebra matrices jordan-normal-form
asked Dec 7 '18 at 17:35 by Brad Scott, edited Dec 7 '18 at 17:58
actually, the 2 block is diagonal, the 3 block has the extra 1. I'll post an answer
– Will Jagy, Dec 7 '18 at 17:49

@WillJagy Ah! I got the geometric multiplicity and the minimal polynomial mixed up
– Brad Scott, Dec 7 '18 at 17:52

as long as they are giving you matrices with integer eigenvalues, you should find $P$ and $P^{-1}$ and confirm $P^{-1}AP$ every time. Plenty of time later for matrices where the eigenvalues are dreadful.
– Will Jagy, Dec 7 '18 at 18:07
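(One way to follow that advice, sketched in SymPy; `jordan_form` is SymPy's built-in and is a suggestion here, not something mentioned in the thread. The block ordering it returns may differ from the $J$ written above.)

```python
from sympy import Matrix

A = Matrix([[4, 0, 1, 0],
            [2, 2, 3, 0],
            [-1, 0, 2, 0],
            [4, 0, 1, 2]])

P, J = A.jordan_form()       # SymPy returns P, J with A == P * J * P.inv()
print(J)                     # the Jordan form (block order may differ)
print(P.inv() * A * P == J)  # True: the "confirm P^{-1} A P" step
```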
2 Answers
The columns of $P$ are a basis of $\mathbb{R}^n$ consisting of the eigenvectors and generalized eigenvectors of $A$.
When you don't have nontrivial blocks, things are straightforward. For example, the eigenvectors of $\lambda=2$ in your case are just a basis of the null space of $A-2I$.
In general, when you have a complicated block structure for some eigenvalue $\lambda$, its eigenvectors and generalized eigenvectors are divided up into "chains" that might look something like
$$
0 \xleftarrow{A-\lambda I} v_1 \xleftarrow{A-\lambda I} v_2 \xleftarrow{A-\lambda I} v_3 \\
0 \xleftarrow{A-\lambda I} v_4 \xleftarrow{A-\lambda I} v_5 \phantom{\xleftarrow{A-\lambda I} v_n} \\
0 \xleftarrow{A-\lambda I} v_6 \xleftarrow{A-\lambda I} v_7 \phantom{\xleftarrow{A-\lambda I} v_n}
$$
Here, $v_1, v_4, v_6$ are eigenvectors and the rest are generalized eigenvectors. The block structure for $\lambda$ consists of a block of size $3$ and two blocks of size $2$. The vectors $v_1, v_2, \dots, v_7$ are going to be the columns of $P$ that we want to find.
It would be a mistake to try to find $v_1, v_4, v_6$ first, because most choices of basis for the null space of $A - \lambda I$ don't extend to full-length chains. For example, if we accidentally chose the basis consisting of $v_1 + v_4, v_4, v_6$, none of them would be part of a chain of length $3$.
So instead, we go from the other end: we try to find $v_3$ first. To do this, we find a basis for the null space of $(A-\lambda I)^3$ (which is the same as the subspace spanned by the yet-unknown $\{v_1, v_2, \dots, v_7\}$), then let $v_3$ be a vector in that basis which does not go to $0$ when you multiply it by $(A-\lambda I)^2$. Then let $v_2 = (A-\lambda I)v_3$ and $v_1 = (A-\lambda I)v_2$.
Next, we do the same thing to find where the other two chains begin. We find a basis for the null space of $(A-\lambda I)^2$ extending the partial basis $\{v_1, v_2\}$, and let $v_5$ and $v_7$ be two more vectors in that basis which do not go to $0$ when multiplied by $A-\lambda I$. Then $v_4$ can be $(A-\lambda I)v_5$ and $v_6 = (A-\lambda I)v_7$.
In this case, we're done, because we've found all of $v_1, v_2, \dots, v_7$ (which then become the corresponding columns of $P$). In general, we would repeat this process with smaller and smaller powers of $(A-\lambda I)$ until we've run out of chains.
To do all this, we already have to know the block structure, so that we know how many chains of each length we're looking for.
In your example, this process is much less onerous; $\lambda=3$ only has one chain, of length $2$, so we find a basis for the null space of $(A-3I)^2$, let $v_2$ be any vector in that basis which is not in the null space of $A-3I$, and let $v_1 = (A-3I)v_2$.
answered Dec 7 '18 at 18:11 by Misha Lavrov
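For concreteness, here is one way to carry out that recipe in SymPy (a sketch; the library, the variable names, and the column ordering are choices made here, not part of the answer above):

```python
from sympy import Matrix, eye, zeros

A = Matrix([[4, 0, 1, 0],
            [2, 2, 3, 0],
            [-1, 0, 2, 0],
            [4, 0, 1, 2]])

N = A - 3*eye(4)
# basis of the null space of (A - 3I)^2
basis = (N**2).nullspace()
# v_2: a basis vector that is not killed by (A - 3I)
v2 = next(v for v in basis if N * v != zeros(4, 1))
v1 = N * v2                         # genuine eigenvector for eigenvalue 3

# eigenvalue 2 needs no chains: its blocks are 1x1
w1, w2 = (A - 2*eye(4)).nullspace()

# columns ordered to match the J in the question:
# the 2x2 Jordan block for 3 first, then the two 1x1 blocks for 2
P = Matrix.hstack(v1, v2, w1, w2)
print(P.inv() * A * P)              # reproduces J
```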
The left two columns are just a basis of eigenvectors for eigenvalue $2$. For eigenvalue $3$, we take the far right column to be some $w$ such that $(A-3I)^2 w = 0$ but $(A-3I) w \neq 0$. Then the third column is $v = (A - 3I)w.$
$$
P =
\left(
\begin{array}{rrrr}
0 & 0 & 1 & 1 \\
1 & 0 & -1 & 3 \\
0 & 0 & -1 & 0 \\
0 & 1 & 3 & 1
\end{array}
\right)
$$
The determinant is $-1$, and
$$
P^{-1} =
\left(
\begin{array}{rrrr}
-3 & 1 & -4 & 0 \\
-1 & 0 & 2 & 1 \\
0 & 0 & -1 & 0 \\
1 & 0 & 1 & 0
\end{array}
\right)
$$
answered Dec 7 '18 at 17:59, edited Dec 7 '18 at 18:05 by Will Jagy
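(A quick check of this $P$, sketched in SymPy; the tooling is an assumption, the matrices are exactly the ones above. Note that with this column order the two blocks for eigenvalue $2$ come first.)

```python
from sympy import Matrix

A = Matrix([[4, 0, 1, 0],
            [2, 2, 3, 0],
            [-1, 0, 2, 0],
            [4, 0, 1, 2]])
P = Matrix([[0, 0, 1, 1],
            [1, 0, -1, 3],
            [0, 0, -1, 0],
            [0, 1, 3, 1]])

print(P.det())          # -1, so P is invertible
print(P.inv() * A * P)  # block diagonal: 2, 2, then the 2x2 Jordan block for 3
                        # (same blocks as the J in the question, in the other order)
```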
So how does this work in general? Or for example, say we have a characteristic polynomial of $C_T(x)=(x+1)^4$ for a matrix like $\begin{pmatrix} -6&0&1&-5 \\ 5&-1&-1&5 \\ 0&0&-1&0 \\ 5&0&-1&4 \end{pmatrix}$ with Jordan normal form $\begin{pmatrix} -1&1&0&0 \\ 0&-1&0&0 \\ 0&0&-1&0 \\ 0&0&0&-1 \end{pmatrix}$
– Brad Scott, Dec 7 '18 at 18:06
This matrix has minimal polynomial $(x+1)^2.$ There are three genuine eigenvectors. Take any vector $w$ that is not an eigenvector, i.e. $(A+I)w \neq 0,$ and set $v= (A+I)w.$ Then take $v,$ which is a genuine eigenvector, and choose eigenvectors $t,u$ so that $t,u,v$ make a basis of genuine eigenvectors. The four columns of $P$ are then $t,u,v,w.$
– Will Jagy, Dec 7 '18 at 18:16
@BradScott Finished that one, taking $$ P = \left( \begin{array}{rrrr} 0 & 1 & 1&0 \\ 1&0&-1&0 \\ 0& 5 & 0& 1 \\ 0&0&-1&0 \end{array} \right) $$ the extra $1$ appears in position $(3,4)$ rather than the $(1,2)$ you indicate
– Will Jagy, Dec 7 '18 at 18:25
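(Again a sketch, in SymPy, confirming this $P$ for the second example; the matrix is called B below only to avoid clashing with the A of the original question.)

```python
from sympy import Matrix

B = Matrix([[-6, 0, 1, -5],
            [5, -1, -1, 5],
            [0, 0, -1, 0],
            [5, 0, -1, 4]])
P = Matrix([[0, 1, 1, 0],
            [1, 0, -1, 0],
            [0, 5, 0, 1],
            [0, 0, -1, 0]])

print(P.inv() * B * P)  # -1's on the diagonal, with a single 1 in position (3, 4)
```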
@WillJagy Putting an extra $1$ in position $(3,4)$ or $(1,2)$ or even $(2,3)$ is equivalent; we can go from one of these to another just by permuting the columns of $P$.
– Misha Lavrov, Dec 7 '18 at 19:38
@MishaLavrov I agree, but I don't think the student asking is sure about such things. I would like it if the students actually went ahead and produced the $P$ and $P^{-1}$ and checked $P^{-1}AP,$ at least when the size is small and all eigenvalues are integers, but mostly I think they do not do that much.
– Will Jagy, Dec 7 '18 at 20:30