Simple proof of the following: A matrix $A$ is onto if and only if its rows are linearly independent
I saw this claim in this video https://www.youtube.com/watch?v=cBSf2pGYAcA&list=PL06960BA52D0DB32B&index=4 (at 23:35) and have struggled in vain to prove it.
Searching for it on Google proved fruitless.
linear-algebra
What do you mean by "matrix $A$ is onto"?
– Somos
Nov 20 at 1:35
$$\forall y \in \mathbb{R}^m,\ \exists x \textrm{ such that } y = Ax$$
– James Shapiro
Nov 20 at 1:49
2 Answers
Here's a simple proof. Let $V$ and $W$ be vector spaces. Let $T:V\to W$ be a linear map. Let $A$ be any matrix representing this map. Then $\newcommand\rk{\operatorname{rank}}\rk A$ is the dimension of the image of $T$, and the number of rows of $A$ is the dimension of $W$. Thus for $T$ to be surjective, we must have that the rank of $A$ is equal to the number of rows of $A$. Since row rank equals column rank, the rank of $A$ is the number of linearly independent rows of $A$. Hence $T$ is surjective if and only if the rows of any matrix representing it are linearly independent.
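As an illustrative aside (my own sketch, not part of the original answer, and assuming NumPy is available), the rank criterion is easy to check numerically: an $m \times n$ matrix $A$ gives a surjective map $x \mapsto Ax$ from $\mathbb{R}^n$ onto $\mathbb{R}^m$ exactly when $\operatorname{rank} A = m$.

```python
# Hypothetical sanity check of the rank criterion (assumes NumPy; not from the answer itself).
import numpy as np

def is_onto(A, tol=1e-10):
    """An m x n matrix A maps R^n onto R^m iff rank(A) equals m (the number of rows)."""
    m = A.shape[0]
    return np.linalg.matrix_rank(A, tol=tol) == m

# Independent rows: the map hits all of R^2.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])
print(is_onto(A))   # True

# Dependent rows (second row = 2 * first row): the image is only a line in R^2.
B = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
print(is_onto(B))   # False
```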
Edit: I realized there is perhaps a slightly more direct proof, which sort of mixes the proof above with the proof that row rank equals column rank.
Review of some relevant linear algebra
First recall some notions. If $V$ is a vector space over a field $K$, then the dual vector space to $V$ is defined to be $V^* := \newcommand\Hom{\operatorname{Hom}}\Hom_K(V,K)$. If $T:V\to W$ is a linear map, then its dual is $T^*:W^*\to V^*$ defined by $T^*(\lambda) = \lambda \circ T$. Also if $\{v_i\}$ and $\{w_j\}$ are bases for $V$ and $W$ with dual bases $\{v_i^*\}$ and $\{w_j^*\}$, then it is easy to show that the matrix for $T$ with respect to the $v_i$ and $w_j$s is $A_{ij}=w_j^*Tv_i$, and the matrix for $T^*$ with respect to $w_j^*$ and $v_i^*$ is $B_{ji}=(T^*w_j^*)v_i=(w_j^*\circ T)v_i=w_j^*Tv_i=A_{ij}$. Thus the matrix for $T^*$ is the transpose of the matrix for $T$.
Now onto the proof
Proof
Let $T:V\to W$. Let $A$ be any matrix representing this map. Then $T$ is surjective if and only if there are no nonzero linear functionals on $W$ that vanish on the image of $T$. However if $\lambda$ vanishes on the image of $T$, then we have $T^*\lambda=0$, and recalling that the matrix of $T^*$ is $A^t$, we see that the coordinates of $\lambda$ give an explicit linear dependence among the columns of $A^t$ (which are the rows of $A$). Conversely any linear dependence among the rows of $A$ gives such a linear functional $\lambda$, proving that $T$ is not surjective.
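To make the dual-space step concrete: in coordinates, a nonzero functional $\lambda$ that kills the image of $T$ is a nonzero vector with $\lambda^t A = 0$, which is exactly a dependence among the rows of $A$. A minimal numerical sketch (my own illustration, assuming NumPy):

```python
# Minimal sketch of the left-null-space / row-dependence correspondence (assumes NumPy).
import numpy as np

# The second row is twice the first, so the rows of A are dependent.
A = np.array([[1.0, 2.0, 0.0],
              [2.0, 4.0, 0.0]])

# Coordinates of a functional lambda on R^2 that vanishes on the image of A:
lam = np.array([2.0, -1.0])

print(np.allclose(lam @ A, 0))            # True: lambda^t A = 0
print(np.allclose(2*A[0] - 1*A[1], 0))    # True: the same relation read as a row dependence
```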
I think that works. It might also be worth noting that for an $m \times n$ matrix to be surjective, we must have $m \le n$ (i.e. it must be a fat matrix). Hence, the rank of the matrix is at most $\min(m, n) = m$. And if the rows are linearly dependent, then the rank is strictly less than $m$.
– James Shapiro
Nov 20 at 1:35
@JamesShapiro That's also a reasonable direction of proof.
– jgon
Nov 20 at 1:38
For a matrix $A$ to be onto, there has to be a pivot in every row. To test the linear independence of the rows, you can look at $A^T$ and row reduce. If every column of $A^T$ has a pivot, the columns are linearly independent. Note that the columns of $A^T$ are the rows of $A$. I leave it to you to show that the row reduction doesn't affect the linear independence.
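A quick way to experiment with this pivot test is symbolic row reduction; the following is my own sketch, assuming SymPy, whose `Matrix.rref()` returns the reduced matrix together with the tuple of pivot columns:

```python
# Sketch of the pivot tests via symbolic row reduction (assumes SymPy; illustrative only).
from sympy import Matrix

A = Matrix([[1, 0, 2],
            [0, 1, 3]])

# Onto test: after row reducing A, every row should contain a pivot,
# i.e. the number of pivot columns equals the number of rows.
_, pivots = A.rref()
print(len(pivots) == A.rows)       # True: A is onto

# Row-independence test: row reduce A^T and check that every column of A^T has a pivot.
_, pivots_T = A.T.rref()
print(len(pivots_T) == A.T.cols)   # True: the rows of A are linearly independent
```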
Okay, I follow that argument and was aware of it before I asked the question. But that's not exactly a satisfying formal proof, in my opinion.
– James Shapiro
Nov 20 at 1:24