How would a composite variable be strongly correlated with one variable but not the other?












I have two variables, x1 and x2, which measure fairly similar things (r ≈ 0.6), with x2 slightly larger than x1 on average. I then created a new variable x3 by subtracting one from the other: x3 = x1 - x2.

However, when I ran the Pearson correlations, x3 is strongly negatively correlated with x2, as expected (r ≈ -0.6), but x3 is barely correlated with x1 (r ≈ 0.1). How is this possible?











Comments:

  • A scatter plot matrix should help. – Nick Cox, Dec 11 '18 at 20:37

  • Possible duplicate of When A and B are positively related variables, can they have opposite effect on their outcome variable C? – sds, Dec 11 '18 at 21:07

  • I have a vague memory of an even closer duplicate, but I cannot find it. – Martijn Weterings, Dec 13 '18 at 14:07



















Tags: correlation















asked Dec 11 '18 at 15:30 by hlinee; edited Dec 11 '18 at 20:37 by Nick Cox








4 Answers


















Score: 13

Here's a simple example. Suppose $\varepsilon_1$ and $\varepsilon_2$ are independent standard normal random variables. Define $X_1 = \varepsilon_1$, $X_2 = X_1 + \varepsilon_2$, and $X_3 = X_1 - X_2$. The correlation of $X_1$ with $X_2$ is then $\tfrac{1}{\sqrt{2}} \approx .71$. Likewise, the correlation of $X_2$ with $X_3$ is $-\tfrac{1}{\sqrt{2}}$. But the correlation of $X_1$ with $X_3$ is the correlation of $\varepsilon_1$ with $\varepsilon_1 - (\varepsilon_1 + \varepsilon_2) = -\varepsilon_2$, which is 0 since the $\varepsilon_i$s are independent.

– Kodiologist, answered Dec 11 '18 at 15:55, edited Dec 11 '18 at 16:54
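This construction is easy to check numerically. Below is a minimal simulation sketch (NumPy; the variable names are mine, not from the answer) reproducing the three correlations:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Independent standard normal noise terms, as in the answer
eps1 = rng.standard_normal(n)
eps2 = rng.standard_normal(n)

x1 = eps1
x2 = x1 + eps2   # corr(x1, x2) should be about 1/sqrt(2) ~ 0.71
x3 = x1 - x2     # equals -eps2, hence independent of x1

print(np.corrcoef(x1, x2)[0, 1])  # ~ 0.71
print(np.corrcoef(x2, x3)[0, 1])  # ~ -0.71
print(np.corrcoef(x1, x3)[0, 1])  # ~ 0
```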






Score: 2

This is by construction of $x_3$. Given that $x_1$ and $x_2$ are closely related in terms of their Pearson correlation, subtracting one from the other removes much of what they share, which reduces the correlation. The easiest way to see this is to consider the extreme scenario of perfect correlation, i.e., $x_2 = x_1$, in which case $x_3 = x_1 - x_2 = 0$ is fully deterministic, and the correlation degenerates (in practice $r \approx 0$).

You can make the argument more formal using the definition of the Pearson correlation, by looking at the covariation between $x_3$ and $x_1$. You will see that the covariation is reduced; by how much depends on the correlation between $x_1$ and $x_2$, i.e., $r_{12}$, and on their standard deviations. All else being equal, the larger $r_{12}$, the smaller $r_{13}$.

– Gkhan Cebs, answered Dec 11 '18 at 15:57
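The claim that a larger $r_{12}$ yields a smaller $r_{13}$ can be seen in a quick simulation. This is a sketch with illustrative names: $x_2$ is constructed so that its correlation with $x_1$ equals a chosen weight $w$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# As corr(x1, x2) = w grows, corr(x1, x3) with x3 = x1 - x2 shrinks.
for w in (0.2, 0.6, 0.9):
    x1 = rng.standard_normal(n)
    x2 = w * x1 + np.sqrt(1 - w**2) * rng.standard_normal(n)  # corr(x1, x2) = w
    x3 = x1 - x2
    print(w, np.corrcoef(x1, x3)[0, 1])  # theory: sqrt((1 - w) / 2)
```

For these unit-variance variables the theoretical value works out to $r_{13} = \sqrt{(1 - w)/2}$, which is decreasing in $w$.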






Comments:

  • By "covariation", do you mean "covariance"? – Kodiologist, Dec 11 '18 at 16:53

  • @Kodiologist Are the two interchangeable? Or do they mean different things? – Cowthulhu, Dec 12 '18 at 14:44

  • @Cowthulhu "Covariance" has a specific definition in statistics, but I'm not familiar with the word "covariation". – Kodiologist, Dec 12 '18 at 14:50

  • @Kodiologist Gotcha, I had never heard "covariance" referred to as "covariation" either, so I was just wondering. I'm very new to this, though, so don't take that as much of an indicator :). Thanks! – Cowthulhu, Dec 12 '18 at 14:59
Score: 2

You can rewrite the equation $x_3 = x_2 - x_1$ as $x_2 = x_3 + x_1$. Then regardless of what you pick as $x_1$ and $x_3$, $x_2$ will be correlated with both $x_1$ and $x_3$, but there is no reason to expect $x_1$ and $x_3$ to be correlated with each other. For instance, if $x_1$ = number of letters in the title of the Best Picture Oscar winner, $x_3$ = number of named hurricanes, and $x_2$ = number of named hurricanes + number of letters in the title of the Best Picture Oscar winner, then you will have $x_3 = x_2 - x_1$, but that doesn't mean that $x_3$ will be correlated with $x_1$.

– Acccumulation, answered Dec 11 '18 at 20:29





















Score: 2

Let $\mathrm{Var}(X_1) = \sigma_1^2$, $\mathrm{Var}(X_2) = \sigma_2^2$, and $\mathrm{Cov}(X_1, X_2) = \sigma_{12} = \rho\sigma_1\sigma_2$.

Then $\mathrm{Var}(X_3) = \mathrm{Var}(X_1 - X_2) = \sigma_1^2 + \sigma_2^2 - 2\sigma_{12}$,

$\mathrm{Cov}(X_1, X_3) = \sigma_1^2 - \sigma_{12}$,

$\mathrm{Cov}(X_2, X_3) = \sigma_{12} - \sigma_2^2$,

$\mathrm{Corr}(X_1, X_3) = \dfrac{\sigma_1^2 - \sigma_{12}}{\sqrt{\sigma_1^2(\sigma_1^2 + \sigma_2^2 - 2\sigma_{12})}}$,

$\mathrm{Corr}(X_2, X_3) = \dfrac{\sigma_{12} - \sigma_2^2}{\sqrt{\sigma_2^2(\sigma_1^2 + \sigma_2^2 - 2\sigma_{12})}}$.

So whether $|\mathrm{Corr}(X_1, X_3)|$ is less than, equal to, or greater than $|\mathrm{Corr}(X_2, X_3)|$ depends on $\sigma_1^2$ and $\sigma_2^2$; it cannot be determined from the correlation coefficient $\rho$ alone.

– user158565, answered Dec 12 '18 at 0:03, edited Dec 12 '18 at 2:02
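The closed-form expressions above can be evaluated directly. In this sketch the helper name `corr_with_diff` and the specific values $\sigma_1 = 1$, $\sigma_2 = 1.5$, $\rho = 0.6$ are my own choices, picked to roughly match the pattern in the question:

```python
import numpy as np

def corr_with_diff(s1, s2, rho):
    """Correlations of x3 = x1 - x2 with x1 and x2, from the formulas above."""
    s12 = rho * s1 * s2              # Cov(x1, x2)
    v3 = s1**2 + s2**2 - 2 * s12     # Var(x3)
    r13 = (s1**2 - s12) / np.sqrt(s1**2 * v3)
    r23 = (s12 - s2**2) / np.sqrt(s2**2 * v3)
    return r13, r23

# sigma2 somewhat larger than sigma1, rho = 0.6 as in the question:
r13, r23 = corr_with_diff(1.0, 1.5, 0.6)
print(r13, r23)  # r13 ~ 0.08 (weak), r23 ~ -0.75 (strong, negative)
```

With these inputs $x_3$ is nearly uncorrelated with $x_1$ yet strongly negatively correlated with $x_2$, reproducing the asymmetry the question asks about.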





