Proving rank of $(I_n-\frac{1}{n}A_n)$ is $n-1$ where $A_n$ is $n\times n$ with all entries $1$
We were asked to find a symmetric idempotent matrix $H$ of rank $n-1$ such that if $X$ is a column vector of $n$ observations, then $\frac{1}{n}X^THX$ is the variance of the observations in $X$.
I found the matrix (for $n$ observations) to be $H_n=I_n-\frac{1}{n}A_n$, where $I_n$ is the $n\times n$ identity matrix, $A_n$ is the $n\times n$ matrix with all entries equal to $1$, and $H_n$ is the required matrix.
It was easy to show that $H_n$ is symmetric and idempotent, but I am having difficulty showing that its rank is $n-1$.
However, it is easy to see that $R_1+R_2+\dots+R_n=0$, where $R_i$ is the $i$-th row, so the rank is strictly less than $n$.
I also noticed that $R_1+R_2+\dots+R_n-R_i\ne0$ for any $i$.
How should I proceed?
linear-algebra matrices matrix-rank
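As a quick sanity check (an illustration only, not a proof), these properties can be verified numerically; here is a minimal NumPy sketch, with the size n and the data vector chosen arbitrarily:

import numpy as np

n = 5
A = np.ones((n, n))                 # n x n matrix of all ones
H = np.eye(n) - A / n               # H_n = I_n - (1/n) A_n

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0])   # arbitrary observations

print(np.allclose(H, H.T))                 # symmetric
print(np.allclose(H @ H, H))               # idempotent
print(np.linalg.matrix_rank(H) == n - 1)   # rank is n - 1
# (1/n) x^T H x equals the population variance (np.var divides by n by default)
print(np.isclose(x @ H @ x / n, np.var(x)))

Each check prints True.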
How many zero eigenvalues does this matrix have? If the matrix is symmetric it should be diagonalizable.
– Ramiro Scorolli
Nov 20 at 10:22
Sorry, haven't studied eigenvectors yet. Is there an approach without them?
– Anvit
Nov 20 at 10:25
It is known (if you don't know this, then it is a very good exercise) that, for an idempotent $n$-by-$n$ matrix $E$ over a field $\mathbb{K}$, $\ker(E)\oplus\text{im}(E)=\mathbb{K}^n$. (That means $\ker(E)+\text{im}(E)=\mathbb{K}^n$ and $\ker(E)\cap\text{im}(E)=\{0\}$.) So, if you find out that $\ker(E)$ is $r$-dimensional, then $\text{im}(E)$ is $(n-r)$-dimensional, whence $E$ is of rank $n-r$. The same situation applies here. Prove that $\ker(H)$ has dimension $1$.
– Batominovski
Nov 20 at 10:53
@Batominovski $\ker(H)=\{(x,x,\dots,x)\mid x\in\mathbb{K}\}$. I think this proves it. Thanks
– Anvit
2 days ago
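Sketch of the direct-sum fact mentioned above, for readers who have not seen it: for any $x\in\mathbb{K}^n$, write $x=Ex+(x-Ex)$. Clearly $Ex\in\text{im}(E)$, and $E(x-Ex)=Ex-E^2x=0$ since $E^2=E$, so $x-Ex\in\ker(E)$; hence $\ker(E)+\text{im}(E)=\mathbb{K}^n$. If $y\in\ker(E)\cap\text{im}(E)$, say $y=Ez$, then $y=Ez=E^2z=Ey=0$, so the sum is direct. In particular, $\mathrm{rank}(E)=\dim\text{im}(E)=n-\dim\ker(E)$.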
3 Answers
Denoting the column vector of all $1$s by $\mathbf{1}$, we have $$H=I_n-\frac{1}{n}\mathbf{1}\mathbf{1}^\top.$$
Indeed, as you say, $H$ is an idempotent matrix, and for an idempotent matrix the rank equals the trace. So $$\mathrm{rank}(H)=\mathrm{trace}(H)=\mathrm{trace}(I_n)-\mathrm{trace}\left(\frac{1}{n}\mathbf{1}\mathbf{1}^\top\right)=n-1.$$
We can also use some elementary rank inequalities, although this is quite unnecessary for proving the result.
We know that for any two matrices $A$ and $B$ of the same order, $$\mathrm{rank}(A-B+B)\le \mathrm{rank}(A-B)+\mathrm{rank}(B),$$
or equivalently, $$\mathrm{rank}(A-B)\ge |\mathrm{rank}(A)-\mathrm{rank}(B)|.$$
Noting that $\mathbf{1}\mathbf{1}^\top$ is a rank-$1$ matrix and applying this inequality to $H$, we get
$$\mathrm{rank}(H)\ge n-1.$$
Now we can rule out $\mathrm{rank}(H)=n$ (the only other possibility) from the fact that, by the matrix determinant lemma, $$\det(H)=1-\frac{1}{n}\mathbf{1}^\top\mathbf{1}=1-1=0.$$
So it must be that $$\mathrm{rank}(H)=n-1.$$
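A small numerical illustration of the rank-equals-trace identity and the determinant value (the choice of $n$ below is arbitrary):

import numpy as np

n = 6
one = np.ones((n, 1))
H = np.eye(n) - one @ one.T / n            # H = I - (1/n) 1 1^T

print(np.linalg.matrix_rank(H) == n - 1)   # rank is n - 1
print(np.isclose(np.trace(H), n - 1))      # trace is also n - 1 (H is idempotent)
print(np.isclose(np.linalg.det(H), 0))     # det is 0, so the rank cannot be n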
Both of them are really nice and elegant solutions. Thank you!
– Anvit
2 days ago
To prove that the rank of $H$ is $n-1$, we may look at the linear system $HX = 0$ and prove that the dimension of its solution space is $1$. The system $HX = 0$ may be written as
$$(S)\;\left\{\begin{aligned}
x_1 + x_2 + \dots + x_n &= n x_1 \\
x_1 + x_2 + \dots + x_n &= n x_2 \\
&\;\;\vdots \\
x_1 + x_2 + \dots + x_n &= n x_n
\end{aligned}\right.$$
Now subtract the first equation from all the other equations:
$$(S)\iff\left\{\begin{aligned}
x_1 + x_2 + \dots + x_n &= n x_1 \\
0 &= n (x_2-x_1) \\
&\;\;\vdots \\
0 &= n (x_n-x_1)
\end{aligned}\right.
\iff\left\{\begin{aligned}
x_1 + x_2 + \dots + x_n &= n x_1 \\
x_2 &= x_1 \\
&\;\;\vdots \\
x_n &= x_1
\end{aligned}\right.$$
Next, subtract equations $2, \ldots, n$ from equation $1$ to get
$$(S)\iff\left\{\begin{aligned}
x_2 &= x_1 \\
&\;\;\vdots \\
x_n &= x_1
\end{aligned}\right.$$
whose solution set is the $1$-dimensional space generated by $\begin{pmatrix}1 \\ 1 \\ \vdots \\ 1\end{pmatrix}$.
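The same conclusion can be checked numerically by estimating the null space of $H$ with an SVD; a short sketch (the value of n is arbitrary):

import numpy as np

n = 7
H = np.eye(n) - np.ones((n, n)) / n

# Right-singular vectors whose singular values are (numerically) zero span the null space.
_, s, vt = np.linalg.svd(H)
null_vectors = vt[s < 1e-10]

print(len(null_vectors) == 1)                # the null space is 1-dimensional
v = null_vectors[0]
print(np.allclose(v, v[0] * np.ones(n)))     # and it is a multiple of (1, 1, ..., 1)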
Let $e_k = (1, \omega^k, \omega^{2k}, \ldots, \omega^{(n-1)k})$, where $\omega=e^{2\pi i/n}$. You have shown that $e_0$ is in the null space of $H$. For $0<k<n$, $e_k$ is an eigenvector of $H$ with eigenvalue $1$ (the entries of $e_k$ sum to zero). As the $e_k$ are linearly independent (why?), this shows that the rank is $n-1$.
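A quick numerical verification of this claim (illustration only; $n$ is an arbitrary choice):

import numpy as np

n = 5
H = np.eye(n) - np.ones((n, n)) / n
omega = np.exp(2j * np.pi / n)

# Row k is e_k = (1, omega^k, omega^{2k}, ..., omega^{(n-1)k})
E = np.array([[omega ** (j * k) for j in range(n)] for k in range(n)])

print(np.allclose(H @ E[0], 0))                               # e_0 lies in the null space
print(all(np.allclose(H @ E[k], E[k]) for k in range(1, n)))  # H e_k = e_k for 0 < k < n
print(np.linalg.matrix_rank(E) == n)                          # the e_k are linearly independent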
Sorry, I should've mentioned: this is in a Statistics course and I haven't studied eigenvectors.
– Anvit
Nov 20 at 10:15
Go and study them then.
– Richard Martin
Nov 20 at 10:18
Please tell me if my reasoning is correct: $\{e_0,e_1,\dots,e_{n-1}\}$ form a basis for the vector space $V=\mathbb{C}^n$, and $H$ is a linear transformation $V\to V$. By the rank–nullity theorem, $\mathrm{rank}(H) = n-\dim(\ker H) = n-1$. If this is correct, I just need to show that the $e_k$ are independent.
– Anvit
Nov 20 at 10:47
You can pick a basis orthogonal to $e_0$, but you need to show that the kernel has dimension no higher than 1. This is what I did above.
– Richard Martin
Nov 20 at 10:52