Expected value of order statistics $X_{(i+1)}$ conditional on $X_{(i)} < t$

Consider $N$ random variables $X_1, X_2, \ldots, X_N$ that are i.i.d. according to some cumulative distribution function $F$. Assume we receive a signal saying that $n$ of the random variables will have values above some threshold $t$ (but we don't know which). To ease notation, let $S_A$ denote this subset of random variables, and let $S_B$ denote the remaining $N-n$ variables. Let $g(S_A) = \min(S_A)$ be the first order statistic of $S_A$.



1) What is the conditional expected value of $g(S_A)$?
$$\mathbb{E}[g(S_A)|t,n]$$



I know that the pdf and expected value of the first order statistic of the entire set, i.e. $X_{(1)} = \min(X_1, X_2, \ldots, X_N)$, are, respectively,
$$f_{X_{(1)}}(x) = N(1-F(x))^{N-1}f(x)$$
$$\mathbb{E}[X_{(1)}] = N \int_{-\infty}^\infty x \left(1 - F(x)\right)^{N-1} f(x)\, dx$$
Setting $N=n$ in the equation above would not give $\mathbb{E}[g(S_A)|t,n]$, since it does not account for the fact that the lowest $N-n$ random variables have values below $t$. I think I need something like
$$\mathbb{E}[g(S_A)|t,n] = \mathbb{E}[X_{(N-n+1)}\,|\, X_{(N-n)} < t]$$



Furthermore let $h(S_B) = h(|S_B|) = h(N-n)$ be a linear function of the size of $S_B$.



2) What is the conditional expected value of $g(S_A)h(S_B)?$
$$\mathbb{E}[g(S_A)h(S_B)|t,n]$$
For general functions $g$ and $h$, $\mathbb{E}[g(S_A)h(S_B)|t,n] \ne \mathbb{E}[g(S_A)|t,n] \times \mathbb{E}[h(S_B)|t,n]$, since $S_A$ and $S_B$ can be considered dependent random variables. But is it the case that $\mathbb{E}[g(S_A)h(S_B)|t,n] = \mathbb{E}[g(S_A)|t,n] \times \mathbb{E}[h(S_B)|t,n]$ when $h$ is a function of the size of $S_B$?
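For intuition (my addition, not part of the original question), $\mathbb{E}[g(S_A)|t,n]$ can be estimated by brute-force simulation: repeatedly draw $N$ samples and keep only the realizations in which exactly $n$ of them exceed $t$. The choice $F = \mathrm{Exp}(1)$ and the values of $N$, $n$, $t$ below are arbitrary, picked only because the exponential case has a simple closed form to compare against.

```python
import random

# Brute-force estimate of E[g(S_A) | t, n]: draw N i.i.d. samples,
# keep only realizations where exactly n of them exceed t, and
# average the minimum of the exceeding subset.
# F = Exp(1), N = 6, n = 3, t = 1.0 are illustrative choices.
random.seed(1)
N, n, t, trials = 6, 3, 1.0, 400_000

kept, total = 0, 0.0
for _ in range(trials):
    xs = [random.expovariate(1.0) for _ in range(N)]
    above = [x for x in xs if x > t]
    if len(above) == n:        # the "signal": exactly n values above t
        kept += 1
        total += min(above)    # g(S_A) = min(S_A)

estimate = total / kept
# For Exp(1), memorylessness gives the exact value t + 1/n.
print(estimate)
```

For the exponential distribution, each variable conditioned to exceed $t$ is distributed as $t$ plus a fresh $\mathrm{Exp}(1)$ by memorylessness, so the estimate should be close to $t + 1/n$.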

  • For given $t$, $h(S_B)=h(N-n)=h(t)$ seems to be a deterministic value, no? If so, then it goes outside the conditional expectation, and we are left with $\mathbb{E}[g(S_A)|t,n]$. Or am I missing something?
    – leonbloy
    Nov 14 at 18:34

  • @leonbloy $t$ is the threshold, while $N-n$ is the size of $S_B$. But, you might be right that when conditioning on $n$, then $h(N-n)$ can be considered deterministic, thus moved outside the expectation.
    – bonna
    Nov 14 at 18:54

  • Yes, sorry about the confusion in notation. Anyway, my point applies. If $h$ (conditioned) is deterministic, then the problem is way simpler than stated - actually it reduces to point 1), right?
    – leonbloy
    Nov 14 at 18:58

  • @leonbloy: Yes. Regarding 1), with theorem 2.4.1 in Arnold, Balakrishnan, Nagaraja (2008) I can calculate $\mathbb{E}[X_{(N-n+1)}| X_{(N-n)} = t] = \int_t^\infty \left[ x \frac{n!}{(n-1)!} \left(\frac{1-F(x)}{1-F(t)}\right)^{n-1} \frac{f(x)}{1-F(t)} \right]dx$
    – bonna
    Nov 14 at 19:09
probability statistics conditional-expectation order-statistics

edited Nov 14 at 18:05
asked Nov 9 at 19:29
bonna
1 Answer

We are told that exactly $n$ rvs have a value greater than $t$. It's clear (perhaps not so much?) that the statistics of those $n$ variables are affected only by the truncation (they are still independent). Hence the result for the first order statistic applies, with the truncated distribution.



Let $G(x)$ be the CDF of the $n$ truncated variables, for $x > t$. Then $$G(x) = \frac{F(x)-F(t)}{1-F(t)}$$



(Here, and in what follows, we implicitly condition on $n,t$.)



Letting $A(x)$ be the CDF of the minimum, we get



$$A(x)= 1 - (1-G(x))^n=1 - \left(1-\frac{F(x)-F(t)}{1-F(t)}\right)^n=1 - \left(\frac{1-F(x)}{1-F(t)}\right)^n$$



From this you can readily compute the expectation (using the density $a(x) = A'(x)$) and solve point 1).



$$\mathbb{E}[g(S_A)|t,n] = \int_t^\infty \left[x\, a(x)\right] dx = \int_t^\infty \left[x\, n \left(\frac{1-F(x)}{1-F(t)}\right)^{n-1} \frac{f(x)}{1-F(t)}\right] dx$$



The rest is rather trivial: $h(\cdot)$ conditioned on $(n,t)$ is deterministic, hence it factors out of the expectation.
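As a numerical sanity check (my addition, with $F = \mathrm{Exp}(1)$ as an arbitrary choice), one can sample the truncated distribution directly and compare the empirical mean of the minimum with the closed form of the integral above; for Exp(1), memorylessness gives $\mathbb{E}[g(S_A)|t,n] = t + 1/n$.

```python
import random

# Sample the minimum of n i.i.d. draws from F truncated above t,
# using F = Exp(1): by memorylessness a truncated draw is t + Exp(1).
# For this F the integral evaluates to t + 1/n in closed form.
random.seed(0)
t, n, trials = 2.0, 5, 200_000

total = 0.0
for _ in range(trials):
    total += min(t + random.expovariate(1.0) for _ in range(n))
mc_mean = total / trials

exact = t + 1.0 / n
print(mc_mean, exact)
```

The two printed values should agree to roughly two decimal places at this number of trials.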

edited Nov 21 at 10:21 (bonna)
answered Nov 14 at 19:12
leonbloy