Zero correlation does not imply independence
I just learned that, when discussing random variables, although independence implies zero correlation, zero correlation does not necessarily imply independence.
While I understand the concept, I can't imagine a real-world situation with zero correlation that does not also have independence.
Can someone please give me an example so I can better understand this phenomenon?
Thanks in advance!
statistics
mathforum.org/library/drmath/view/64808.html – Charles, Jul 15 '13 at 19:19
5 Answers
Consider the following betting game.
Flip a fair coin to determine the amount of your bet: if heads, you bet $1, if tails you bet $2. Then flip again: if heads, you win the amount of your bet, if tails, you lose it. (For example, if you flip heads and then tails, you lose $1; if you flip tails and then heads you win $2.) Let $X$ be the amount you bet, and let $Y$ be your net winnings (negative if you lost).
$X$ and $Y$ have zero correlation. You can compute this explicitly, but it's basically the fact that you are playing a fair game no matter how much you bet. But they are not independent; indeed, if you know $Y$, then you know $X$ (if $Y = -2$, for instance, then $X$ has to be 2.) Explicitly, the probability that $Y=-2$ is $1/4$, and the probability that $X=2$ is $1/2$, but the probability that both occur is $1/4$, not $1/8$. (Indeed, in this game, there is no event with probability $1/8$.)
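A quick way to check both claims numerically is a small simulation. This is a sketch added here for illustration, not part of the original answer; NumPy, the seed, and the sample size are incidental choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# First flip: bet $1 on heads, $2 on tails, each with probability 1/2.
X = rng.choice([1, 2], size=n)
# Second flip: win (+1) or lose (-1) the amount of the bet.
sign = rng.choice([1, -1], size=n)
Y = sign * X  # net winnings

print(np.corrcoef(X, Y)[0, 1])             # ~0: uncorrelated
print(np.mean((X == 2) & (Y == -2)))       # ~1/4: joint probability
print(np.mean(X == 2) * np.mean(Y == -2))  # ~1/8: product of the marginals
```

The joint probability differs from the product of the marginals, which is exactly the failure of independence described above.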
Zero correlation rules out linear dependence, but it does not capture nonlinear dependence. A typical example: let $x$ be uniform on $[-1,1]$ (so it has zero mean) and take $x$ and $x^2$. Their correlation is zero, yet they are clearly not independent, since $x^2$ is completely determined by $x$.
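As a sanity check on this example, here is a minimal numerical sketch (my addition; the library and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=1_000_000)  # uniform on [-1, 1], mean 0
y = x**2                                # a deterministic function of x

# Covariance is E[x^3] - E[x]E[x^2] = 0 by symmetry, so the correlation is ~0.
print(np.corrcoef(x, y)[0, 1])

# But x and y are far from independent: conditioning on |x| > 0.5 forces y > 0.25.
print(np.mean(y <= 0.25), np.mean(y[np.abs(x) > 0.5] <= 0.25))  # ~0.5 vs. 0.0
```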
Let $X$ be any random variable. Let $P\{I = 1\} = P\{I = -1\} = 1/2$, with $I$ independent of $X$. Let $Y = IX$. (Thus, $Y = \pm X$, each with probability $1/2$, independent of the value of $X$.) Then $X$ and $Y$ are uncorrelated but not independent. We could replace $I$ by any zero-mean random variable independent of $X$. [Could someone please tell me how to insert that first equation correctly?]
For curly braces, type \{ and \}. I edited them in for you. But I think most people write parentheses instead: $P(I = 1)$ etc. – Nate Eldredge, Jul 15 '13 at 19:38
Incidentally, my example is of this form. – Nate Eldredge, Jul 15 '13 at 19:39
Incidentally, if $X$ is symmetric Bernoulli, then $(X,Y)$ is independent (hence some more care should be brought to the idea). – Did, Dec 29 '16 at 8:35
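A short simulation sketch of this construction (an editorial addition, not from the original answer; the exponential distribution for $X$ is an arbitrary choice that sidesteps the symmetric-Bernoulli caveat in the comment above):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

X = rng.exponential(size=n)      # any X with finite variance works; exponential is illustrative
I = rng.choice([1, -1], size=n)  # symmetric sign, independent of X
Y = I * X                        # Y = +/- X with probability 1/2 each

print(np.corrcoef(X, Y)[0, 1])   # ~0, since E[I] = 0 and I is independent of X

# Not independent: |Y| = |X| always, so this joint event is impossible
print(np.mean((X > 1) & (np.abs(Y) <= 1)))        # 0
print(np.mean(X > 1) * np.mean(np.abs(Y) <= 1))   # > 0
```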
Consider these two physical variables:
- A random velocity $V$ of a vehicle along a straight road between towns A and B (towards B, velocity is positive, whereas towards A velocity is negative); and
- Kinetic energy $K = \frac{1}{2}mV^2$ of the vehicle, where $m$ is the mass of the vehicle.
Let's say velocity takes values between $-50$ and $+50$ miles per hour with equal probability, so the average velocity is $0$. When the velocity is $-50$ the kinetic energy is $1250m$, and when the velocity is $+50$ the kinetic energy is also $1250m$. Because the mean velocity is zero, the correlation is simply proportional to the mean of the product $KV$ (an integral rather than a sum, in fact, because velocity is continuous). The product $KV$ is $-62500m$ when $V=-50$ and $+62500m$ when $V=+50$; more generally, every negative velocity is exactly as likely as the corresponding positive velocity and contributes the opposite product, so the contributions cancel in pairs and the correlation is zero. Yet $K$ is completely determined by $V$, so the two variables are certainly not independent.
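A numerical sketch of this velocity/energy example (added for illustration; the mass, seed, and sample size are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)
m = 1.0                                    # vehicle mass (arbitrary units)
V = rng.uniform(-50, 50, size=1_000_000)   # velocity, uniform on [-50, 50]
K = 0.5 * m * V**2                         # kinetic energy, a function of V alone

print(np.corrcoef(V, K)[0, 1])             # ~0, by the symmetry of V about 0

# Dependence: K > 900m forces |V| > ~42.4, so this joint event never happens
print(np.mean((K > 900 * m) & (np.abs(V) < 40)))       # 0
print(np.mean(K > 900 * m) * np.mean(np.abs(V) < 40))  # > 0
```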
I will give a geometric example involving random points in the plane. These come up in real life all the time whenever there is a mechanism by which points are distributed (for example, the location of a house).
Choose a random point $(X,Y)$ uniformly from the unit circle $x^2 + y^2 = 1$. (By this I mean that the probability of $(X,Y)$ lying in an arc of the circle is proportional to the length of the arc; equivalently, choose $\theta$ uniformly distributed in $[0,2\pi)$ and put $X=\cos(\theta)$, $Y=\sin(\theta)$.)
Now, the random variables $X$ and $Y$ are uncorrelated. Indeed, for any given value $X=x$ there are exactly two possible values of $Y$, namely $+\sqrt{1-x^2}$ and $-\sqrt{1-x^2}$, and they are equally likely, so each has conditional probability $\frac{1}{2}$. Hence $E(XY\mid X=x) = \frac{1}{2}x\sqrt{1-x^2}+\frac{1}{2}x(-\sqrt{1-x^2})=0$. From here you should be able to see that they are uncorrelated.
However, they are not independent! There are many ways to see why. Here is one "certificate" that shows they are not independent (although it doesn't really explain the intuition for why they aren't independent; you will have to think about that one).
Notice that $P(X>\frac{\sqrt{2}}{2},\, Y>\frac{\sqrt{2}}{2})=0$, since $X^2+Y^2=1$ always. However, the probabilities $P(X>\frac{\sqrt{2}}{2})$ and $P(Y>\frac{\sqrt{2}}{2})$ are both non-zero, so it is impossible that $P(X>\frac{\sqrt{2}}{2},\, Y>\frac{\sqrt{2}}{2})=P(X>\frac{\sqrt{2}}{2})P(Y>\frac{\sqrt{2}}{2})$.
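A brief simulation sketch of the circle example (my addition, using the angle parametrization mentioned above; the sample size is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, size=1_000_000)  # uniform angle on the circle
X, Y = np.cos(theta), np.sin(theta)

print(np.corrcoef(X, Y)[0, 1])   # ~0: uncorrelated

c = np.sqrt(2) / 2
print(np.mean((X > c) & (Y > c)))        # exactly 0: both cannot exceed sqrt(2)/2
print(np.mean(X > c) * np.mean(Y > c))   # ~1/16: product of the marginals
```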