How do I prove that these vectors form a basis?























This is the problem: $A$ is a $2 \times 2$ matrix with eigenvectors $\vec v_1, \vec v_2$ belonging to distinct eigenvalues $\lambda_1 \neq \lambda_2$. Show that $\mathcal B = \{\vec v_1, \vec v_2\}$ is a basis of $\Bbb R^2$, and find the matrix of the linear transformation $T_A(\vec x) = A\vec x$ relative to $\mathcal B$.



I know that, by the definition of a basis, the vectors $v_1$ and $v_2$ should span the entire subspace. Therefore, if the first constant is not equal to the second constant, and if both constants give a linear transformation, then they must be linearly independent and therefore must form a basis. Is that a correct proof, or am I missing something? Also, I don't know what the matrix of the linear transformation is.










linear-algebra






asked Mar 16 '16 at 22:41









Jeansandcofffeee

  • "...then they must be linearly independent". I don't see where you justify this claim. And what is "they"? You want to prove the vectors $v_1,v_2$ are linearly independent, but you only refer to constants and linear transformations in the argument leading up to that. There's no place where you have considered anything about the vectors.
    – Erick Wong
    Mar 16 '16 at 22:44












  • A standard exercise shows that two eigenvectors corresponding to two distinct eigenvalues must be linearly independent. If you are using this fact, then everything becomes obvious; but if you don't, then you are using some unjustified argument.
    – Crostul
    Mar 16 '16 at 23:22














1 Answer
Suppose that $c_1 \vec v_1 + c_2 \vec v_2 = 0.$ Multiply both sides by $A$ to get $c_1 \lambda_1 \vec v_1 + c_2 \lambda_2 \vec v_2 = 0.$ Multiply the first equation by $\lambda_1$ and subtract it from the second to get $(\lambda_2 - \lambda_1) c_2 \vec v_2 = 0.$ Because the $\lambda$s are distinct and $\vec v_2$ is not zero, we must have $c_2 = 0.$ By a similar argument, $c_1 = 0.$ Hence $\vec v_1, \vec v_2$ are linearly independent. Now, there are two of them, and the space $\Bbb R^2$ has exactly two dimensions, so $\vec v_1, \vec v_2$ form a basis of $\Bbb R^2.$
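
This argument is easy to sanity-check numerically. Below is a minimal NumPy sketch (not part of the proof): it picks a concrete $2\times 2$ matrix $A$ of my own choosing with distinct eigenvalues and confirms that the two eigenvectors are linearly independent, i.e. that the matrix with columns $\vec v_1, \vec v_2$ has nonzero determinant.

    import numpy as np

    # Illustrative 2x2 matrix with distinct eigenvalues (chosen for this sketch,
    # not taken from the original problem).
    A = np.array([[2.0, 1.0],
                  [0.0, 5.0]])

    # np.linalg.eig returns the eigenvalues and the eigenvectors as columns.
    eigvals, eigvecs = np.linalg.eig(A)
    lam1, lam2 = eigvals
    v1, v2 = eigvecs[:, 0], eigvecs[:, 1]
    assert not np.isclose(lam1, lam2), "eigenvalues should be distinct"

    # v1, v2 are linearly independent exactly when [v1 v2] is invertible,
    # i.e. its determinant is nonzero -- so {v1, v2} is a basis of R^2.
    P = np.column_stack([v1, v2])
    print("det [v1 v2] =", np.linalg.det(P))  # nonzero => linearly independent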



The $i$th column of the matrix of $T_A$ is $[T_A \vec v_i]_{\mathcal B}$, where $[\,\cdot\,]_{\mathcal B}$ denotes the coordinate matrix relative to $\mathcal B.$ We compute
$$\begin{align}
[T_A \vec v_1]_{\mathcal B} & = [A \vec v_1]_{\mathcal B} = [\lambda_1 \vec v_1]_{\mathcal B} = [\lambda_1 \vec v_1 + 0 \cdot \vec v_2]_{\mathcal B} = \begin{bmatrix} \lambda_1 \\ 0 \end{bmatrix} \\
[T_A \vec v_2]_{\mathcal B} & = [A \vec v_2]_{\mathcal B} = [\lambda_2 \vec v_2]_{\mathcal B} = [0 \cdot \vec v_1 + \lambda_2 \vec v_2]_{\mathcal B} = \begin{bmatrix} 0 \\ \lambda_2 \end{bmatrix}.
\end{align}$$

Thus, the matrix of $T_A$ is
$$\begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix}.$$
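
Continuing the same illustrative sketch (again with an example matrix of my own choosing, not the one from the problem), this diagonal form can be checked numerically: the matrix of $T_A$ relative to $\mathcal B$ equals the change-of-basis conjugate $P^{-1} A P$, where $P = [\vec v_1\ \ \vec v_2]$, and it comes out as $\operatorname{diag}(\lambda_1, \lambda_2)$.

    import numpy as np

    # Same illustrative matrix as in the previous sketch.
    A = np.array([[2.0, 1.0],
                  [0.0, 5.0]])
    eigvals, eigvecs = np.linalg.eig(A)
    P = eigvecs  # columns are v1, v2, i.e. the basis B

    # Matrix of T_A relative to B: B-coordinates in, B-coordinates out,
    # which works out to the similarity transform P^{-1} A P.
    M = np.linalg.inv(P) @ A @ P
    print(np.round(M, 10))
    # Expected: diag(lambda_1, lambda_2)
    assert np.allclose(M, np.diag(eigvals))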






        answered Nov 24 at 21:38









        Maurice P
