If the system Ax=b is consistent for every n × 1 matrix b, then A is invertible. [closed]
I have trouble understanding the proof here. I spent an hour trying to understand it, but I gave up. Can anyone help me with it?
closed as off-topic by user1551, Saad, Adrian Keister, user10354138, José Carlos Santos Nov 30 at 23:59


This question appears to be off-topic. The users who voted to close gave this specific reason:


  • "This question is missing context or other details: Please improve the question by providing additional context, which ideally includes your thoughts on the problem and any attempts you have made to solve it. This information helps others identify where you have difficulties and helps them write answers appropriate to your experience level." – user1551, Saad, Adrian Keister, user10354138, José Carlos Santos

If this question can be reworded to fit the rules in the help center, please edit the question.
  • 1) Yes, that's exactly what it is. The point is that if you can solve $A\mathbf{x}=\mathbf{b}$ for any $\mathbf{b}$, you can solve it for $\mathbf{b}$ equal to any of the various columns of the identity matrix. 2) They mean to build a matrix $C$ out of the columns that form the solutions in part 1). Since matrix multiplication is done one column at a time, this produces a matrix whose columns combine with $A$ to give exactly the columns of the identity matrix, so they have produced an inverse for $A$.
    – John Brevik
    Oct 7 '15 at 2:23
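The "one column at a time" fact in the comment above can be sketched numerically. The matrices below are made-up examples, chosen only for illustration:

```python
import numpy as np

# Made-up 3x3 matrices, for illustration only.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])
C = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])

# Matrix multiplication works one column at a time:
# column j of A @ C equals A applied to column j of C.
product = A @ C
for j in range(C.shape[1]):
    assert np.allclose(product[:, j], A @ C[:, j])
```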
  • I think I kind of get it. Matrix $C$ isn't just a $1 \times n$ matrix, right? It's an $n \times n$ matrix, where each column has $n$ rows, basically $n$ unknowns. We take matrix $A$ and multiply it by one such column to get a column of the identity matrix, then repeat to get each remaining column of the identity matrix. Then we combine them to form a matrix which really is the identity matrix, and from there we prove the rest. Am I right?
    – Zhi J Teoh
    Oct 7 '15 at 3:29
  • Yeah, I think that's pretty much the point. I wouldn't really call it an augmented matrix, but otherwise you're right on.
    – John Brevik
    Oct 7 '15 at 11:54

linear-algebra systems-of-equations
edited Nov 30 at 21:45
Martin Sleziak
asked Oct 7 '15 at 2:14
Zhi J Teoh
1 Answer
At the beginning, they're saying that since $Ax = b$ is consistent no matter which vector $b$ is, then in particular the system
\begin{equation}
Ax = \begin{bmatrix} 1 \\ 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}
\end{equation}
is consistent. In other words, there exists a vector $x_1$ such that
\begin{equation}
Ax_1 = \begin{bmatrix} 1 \\ 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}.
\end{equation}

And, since $Ax = b$ is consistent no matter what $b$ is, it must be true that the system
\begin{equation}
Ax = \begin{bmatrix} 0 \\ 1 \\ 0 \\ \vdots \\ 0 \end{bmatrix}
\end{equation}
is consistent. In other words, there exists a vector $x_2$ such that
\begin{equation}
Ax_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \\ \vdots \\ 0 \end{bmatrix}.
\end{equation}
And so on. Now take these vectors $x_1, x_2, \ldots, x_n$ and make them the columns of a matrix $C$:
\begin{equation}
C = \begin{bmatrix} x_1 & x_2 & \cdots & x_n \end{bmatrix}.
\end{equation}
You can check that $AC = I$, the identity matrix. So $A$ is invertible.
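The construction in this answer can be sketched numerically with NumPy. The specific matrix $A$ below is a made-up example; any $A$ for which $Ax = b$ is consistent for every $b$ would do:

```python
import numpy as np

# A made-up invertible matrix; any A for which Ax = b is
# consistent for every b would serve as an example.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
n = A.shape[0]
I = np.eye(n)

# Solve A x_i = e_i for each standard basis vector e_i,
# then make those solutions the columns of C.
columns = [np.linalg.solve(A, I[:, i]) for i in range(n)]
C = np.column_stack(columns)

# AC = I, so C is a (right) inverse of A.
assert np.allclose(A @ C, np.eye(n))
```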
        answered Oct 7 '15 at 2:25
        littleO