Using Adam's Law (Law of Total Expectation) to find expectation of residual
This may seem like a rather simple question, but I haven't been able to come up with an explanation myself or find one on the Internet.






I've learned that Adam's Law states that



$$E(E(Y|X)) = E(Y)$$



While solving an exercise problem, I noticed that the solution used the fact that



$$E(E(Y|X)|X) = E(Y|X)$$






I'm a bit confused as to how this equality has been derived. The form of Adam's Law with extra conditioning I'm familiar with is



$$E(E(Y|X, Z)|Z) = E(Y|Z)$$



but there seems to be something missing in the equation that I've given.






Would anybody be able to help me understand the derivation?



Thank you.






EDIT



The textbook that I'm using is Introduction to Probability (1e) - Blitzstein & Hwang.



The textbook doesn't give a more specific definition of the property I mentioned. Quoting it exactly:




Conditional expectation has some very useful properties, that often allow us to solve problems without having to go all the way back to the definition.



...



Theorem 9.3.7 (Adam's Law): For any random variables $X$ and $Y$,



$$E(E(Y|X)) = E(Y)$$



Adam's law with extra conditioning:



$$E(E(Y|X, Z)|Z) = E(Y|Z)$$




The remainder of the textbook related to the law is about how to prove the equality, which I understand how to do.






Here's the specific exercise problem that has led me to ask this question:




Let $X$ and $Y$ be random variables with finite variances, and let $W = Y - E(Y|X)$.



Compute $E(W|X)$.




My approach is as follows:



\begin{align}
E(W|X) &= E(Y - E(Y|X) \mid X) \\
&= E(Y|X) - E(E(Y|X) \mid X)
\end{align}



and this second term is the part that threw me off. The correct answer uses the fact that $E(E(Y|X)|X) = E(Y|X)$, which gives $E(W|X) = 0$.
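To convince myself, I also ran a quick numerical check (my own toy setup, not from the book): pick a model where $E(Y|X)$ is known in closed form, compute $W = Y - E(Y|X)$, and verify that the average of $W$ within each value of $X$ is zero.

```python
# Toy sanity check: X ~ Bernoulli(0.5), Y = 2X + N(0, 1), so E(Y|X) = 2X
# exactly. Then W = Y - E(Y|X) should satisfy E(W | X = x) = 0 for each x.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.integers(0, 2, size=n)   # X ~ Bernoulli(0.5)
y = 2 * x + rng.normal(size=n)   # Y = 2X + noise, so E(Y|X) = 2X
w = y - 2 * x                    # W = Y - E(Y|X)

for value in (0, 1):
    print(f"E(W | X={value}) ~ {w[x == value].mean():.4f}")  # both near 0
```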
  • $E(Y\mid X)$ is by definition measurable with respect to $\sigma(X)$, so it can be directly taken out of $E(\cdot \mid X)$. – GNUSupporter 8964民主女神 地下教會, Dec 9 '18 at 13:36

  • Hi. I'm not too familiar with $\sigma(X)$... Could you post an explanation answer please? – Seankala, Dec 9 '18 at 13:38

  • It's well known that for a random variable $X:(\Omega, \Sigma) \to (E, \mathcal{E})$, $\sigma(X) = \{X^{-1}(B) \mid B \in \mathcal{E}\}$. – GNUSupporter 8964民主女神 地下教會, Dec 9 '18 at 13:43

  • @Seankala You seem to be mainly in need of a definition of $E(Y\mid X)$. Do you know one? – Did, Dec 9 '18 at 14:14

  • Which definition of the conditional expectation $E(Y\mid X)$ do you know? If you give the context of your exercise (what book, which chapter are you reading?), it would be easy to tell. – user587192, Dec 9 '18 at 14:15
probability-theory conditional-expectation
asked Dec 9 '18 at 13:32 by Seankala (edited Dec 9 '18 at 15:23)
2 Answers

All you need is to read your book (the one by Blitzstein-Hwang you mentioned in the post) more thoroughly, really.



The definition you need is that of conditional expectation given a random variable. Before discussing anything about $E(Y|X)$, the first thing you should know is what the notation means. Section 9.2, "Conditional expectation given an r.v.", is the first place to refer to.




[Image: Definition 9.2.1, conditional expectation given an r.v., from the textbook]




Note that the whole Section 9.3 is about "Properties of conditional expectation". In particular, you have




[Image: Theorem 9.3.2 ("taking out what's known") from the textbook]




Applying Theorem 9.3.2, one has
$$
E(E(Y|X)|X)=E(g(X)\cdot 1|X)=g(X)E(1|X)=g(X)\cdot 1=g(X),
$$
where $g(X):=E(Y|X)$.
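For a concrete toy instance (my own illustration, not from the book): suppose it happens that $E(Y|X)=e^X$. Then taking $g(X)=e^X$ in Theorem 9.3.2 gives
$$
E(E(Y|X)\mid X)=E(e^X\cdot 1\mid X)=e^X\,E(1\mid X)=e^X=E(Y|X).
$$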





[Added:] Note that a rigorous measure-theory-based answer to your question would be very different. (Even Definition 9.2.1 and the statement of Theorem 9.3.2 would change in that setting.)
answered Dec 9 '18 at 16:57 by user587192 (edited Dec 10 '18 at 2:04)

As commenters point out, a definition would clear things up. Rick Durrett's Probability (free online) presents $E(X\mid\mathcal F)$ as the random variable $Z$ such that

1) $Z\in\mathcal F$ (i.e., $Z$ is $\mathcal F$-measurable), and

2) $\int_A X\,dP=\int_A Z\,dP$ for all $A\in\mathcal F$.

To prove your fact, we have to show that

1) $E(E(Y|X)|X)\in \sigma(X)$, and

2) $\int_A E(E(Y|X)|X)\,dP=\int_A Y\,dP$ for all $A\in\sigma(X)$.

The first part follows from the fact that $E(E(Y|X)|X)$ is a random variable that results from conditioning on $\sigma(X)$. ($E(Y|X)$ means exactly $E(Y\mid\sigma(X))$.)

The second part follows from applying property 2) twice: for all $A\in\sigma(X)$,

$$\int_A E(E(Y|X)|X)\,dP=\int_A E(Y|X)\,dP=\int_A Y\,dP,$$

where the first equality applies property 2) to $E(E(Y|X)|X)$, the conditional expectation of $E(Y|X)$ given $\sigma(X)$, and the second applies property 2) to $E(Y|X)$, the conditional expectation of $Y$ given $\sigma(X)$.
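If it helps, here is a minimal discrete sketch (my own construction, assuming a finite uniform probability space) that checks the defining property 2) directly: build $E(Y|X)$ as the $P$-weighted average of $Y$ on each atom $\{X=x\}$, then verify that its integral over each such atom matches the integral of $Y$.

```python
# Discrete check of Durrett's defining property on a finite sample space:
# E(Y|X) is constant on each event A = {X = x}, and integrating it over A
# gives the same value as integrating Y over A.
import numpy as np

rng = np.random.default_rng(1)
omega = np.arange(12)            # sample space {0, ..., 11}
p = np.full(12, 1 / 12)          # uniform probability measure
X = omega % 3                    # X takes values 0, 1, 2
Y = rng.normal(size=12)          # an arbitrary random variable Y

# E(Y|X): on each atom {X = x}, the P-weighted average of Y.
EY_given_X = np.empty(12)
for x in np.unique(X):
    A = (X == x)
    EY_given_X[A] = (Y[A] * p[A]).sum() / p[A].sum()

# Property 2): integral of E(Y|X) over A equals integral of Y over A.
for x in np.unique(X):
    A = (X == x)
    lhs = (EY_given_X[A] * p[A]).sum()
    rhs = (Y[A] * p[A]).sum()
    print(f"A = {{X={x}}}: int E(Y|X) dP = {lhs:.4f}, int Y dP = {rhs:.4f}")
```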
answered Dec 9 '18 at 15:06 by manofbear