Simple proof of the following: A matrix $A$ is onto if and only if its rows are linearly independent











I saw this claim in this video https://www.youtube.com/watch?v=cBSf2pGYAcA&list=PL06960BA52D0DB32B&index=4 (at 23m35s) and have struggled in vain to prove it.



Searching for it on Google proved fruitless.










linear-algebra






edited Nov 20 at 1:24









Tianlalu

asked Nov 20 at 1:14









James Shapiro

  • What do you mean by "matrix $A$ is onto"?
    – Somos
    Nov 20 at 1:35












  • $$\forall y \in \mathbb{R}^m, \exists x \textrm{ such that } y = Ax$$
    – James Shapiro
    Nov 20 at 1:49




















2 Answers
Here's a simple proof. Let $V$ and $W$ be vector spaces, let $T:V\to W$ be a linear map, and let $A$ be any matrix representing this map. Then $\operatorname{rank} A$ is the dimension of the image of $T$, and the number of rows of $A$ is the dimension of $W$. Thus $T$ is surjective if and only if the rank of $A$ equals the number of rows of $A$. Since row rank equals column rank, the rank of $A$ is the maximal number of linearly independent rows of $A$, so it equals the number of rows precisely when all the rows are linearly independent. Hence $T$ is surjective if and only if the rows of any matrix representing it are linearly independent.
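
For concreteness, here is a minimal numerical sketch of this rank criterion (assuming NumPy; the matrix below is just a made-up example):

    import numpy as np

    # Made-up matrix representing a map T : R^3 -> R^2.
    A = np.array([[1.0, 0.0, 2.0],
                  [0.0, 1.0, 3.0]])

    m = A.shape[0]                   # number of rows = dim W
    r = np.linalg.matrix_rank(A)     # dimension of the image of T

    # T is surjective exactly when rank(A) equals the number of rows of A ...
    print(r == m)                               # True

    # ... which, since row rank equals column rank, is the same as the
    # rows of A being linearly independent.
    print(np.linalg.matrix_rank(A.T) == m)      # True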



Edit: I realized there is perhaps a slightly more direct proof, which mixes the proof above with the proof that row rank equals column rank.



Review of some relevant linear algebra



First recall some notions. If $V$ is a vector space over a field $K$, then the dual vector space to $V$ is defined to be $V^* := \operatorname{Hom}_K(V,K)$. If $T:V\to W$ is a linear map, then its dual is $T^*:W^*\to V^*$ defined by $T^*(\lambda) = \lambda \circ T$. Also, if $\{v_i\}$ and $\{w_j\}$ are bases for $V$ and $W$ with dual bases $\{v_i^*\}$ and $\{w_j^*\}$, then it is easy to show that the matrix for $T$ with respect to the $v_i$ and $w_j$ is $A_{ij}=w_j^*Tv_i$, and the matrix for $T^*$ with respect to the $w_j^*$ and $v_i^*$ is $B_{ji}=(T^*w_j^*)v_i=(w_j^*\circ T)v_i=w_j^*Tv_i=A_{ij}$. Thus the matrix for $T^*$ is the transpose of the matrix for $T$.



Now onto the proof



Proof



Let $T:V\to W$, and let $A$ be any matrix representing this map. Then $T$ is surjective if and only if there are no nonzero linear functionals on $W$ that vanish on the image of $T$. However, if $\lambda$ vanishes on the image of $T$, then we have $T^*\lambda=0$, and recalling that the matrix of $T^*$ is $A^t$, we see that the coordinates of $\lambda$ give an explicit linear dependence among the columns of $A^t$ (which are the rows of $A$). Conversely, any nonzero linear dependence among the rows of $A$ gives such a nonzero linear functional $\lambda$, proving that $T$ is not surjective.
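
A small sketch of this dual argument (assuming NumPy/SciPy; the matrix below, whose third row is the sum of the first two, is a made-up example):

    import numpy as np
    from scipy.linalg import null_space

    # Made-up example: the third row is the sum of the first two,
    # so the corresponding map T is not surjective.
    A = np.array([[1.0, 0.0, 2.0],
                  [0.0, 1.0, 3.0],
                  [1.0, 1.0, 5.0]])

    # Functionals lambda on W vanishing on the image of T satisfy
    # lambda o T = 0, i.e. A^T lambda = 0, so they form null(A^T).
    lam = null_space(A.T)
    print(lam.shape[1] > 0)               # True: a nonzero such functional exists

    # Its coordinates give an explicit linear dependence among the rows of A.
    print(np.allclose(lam[:, 0] @ A, 0))  # True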






edited 2 days ago

answered Nov 20 at 1:29

jgon























  • I think that works. It might also be worth noting that for an $m \times n$ matrix to be surjective, we must have $m \le n$ (i.e. it must be a fat matrix). Hence, the rank of the matrix is at most $\min(m, n) = m$. And if the rows are linearly dependent, then the rank is strictly less than $m$.
    – James Shapiro
    Nov 20 at 1:35










  • @JamesShapiro That's also a reasonable direction of proof.
    – jgon
    Nov 20 at 1:38



















For a matrix $A$ to be onto, there has to be a pivot in every row. To test the linear independence of the rows, you can look at $A^T$ and row reduce. If every column of $A^T$ has a pivot, the columns are linearly independent. Note that the columns of $A^T$ are the rows of $A$. I leave it to you to show that row reduction doesn't affect linear independence.
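
A minimal sketch of this pivot test (assuming SymPy's exact row reduction; the matrix below is a made-up example):

    import sympy as sp

    # Made-up 2x3 example matrix.
    A = sp.Matrix([[1, 0, 2],
                   [0, 1, 3]])

    # A is onto iff its reduced row echelon form has a pivot in every row.
    _, pivots_A = A.rref()
    print(len(pivots_A) == A.rows)        # True

    # The rows of A are independent iff every column of A^T is a pivot column.
    _, pivots_At = (A.T).rref()
    print(len(pivots_At) == A.rows)       # True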






answered Nov 20 at 1:23

Joel Pereira





















  • Okay, I follow that argument and was aware of it before I asked the question. But that's not exactly a satisfying formal proof, in my opinion.
    – James Shapiro
    Nov 20 at 1:24












