Compound Binomial - Exponential process
Problem:



I have a sum of $N$ random variables $X_i$, where $N$ is distributed according to a binomial distribution and the $X_i$ are independent and identically distributed according to an exponential distribution.
I would like to get the pdf of the sum

$$ Y_N = X_1 + X_2 + \dots + X_N. $$

If I write the pmf of $N$ as
$$ p_n = {N \choose n} p^n q^{N-n} $$
and the pdf of the $X_i$ as
$$ f(x) = \lambda e^{-\lambda x}, $$
how can I get the pdf $p(y)$ of $Y_N$?
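For concreteness, the sum is easy to simulate directly. A minimal sketch (the parameter values $N=10$, $p=0.3$, $\lambda=2$ are illustrative, not from the problem); note the draw is exactly zero whenever the binomial count is zero, so the distribution has an atom of mass $q^N$ at $y=0$:

```python
import random

def sample_Y(N, p, lam, rng):
    """One draw of Y = X_1 + ... + X_n with n ~ Binomial(N, p), X_i ~ Exp(rate=lam)."""
    n = sum(rng.random() < p for _ in range(N))          # binomial count of successes
    return sum(rng.expovariate(lam) for _ in range(n))   # sum of n exponentials (0.0 if n == 0)

rng = random.Random(0)
N, p, lam = 10, 0.3, 2.0
q = 1 - p
draws = [sample_Y(N, p, lam, rng) for _ in range(100_000)]

mean_hat = sum(draws) / len(draws)                     # should approach N*p/lam
atom_hat = sum(d == 0.0 for d in draws) / len(draws)   # should approach q**N
```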



Attempt:



I gather this is called a compound process. I read about it in Feller, in the section "Sum of a Random Number of Random Variables", but he does not mix discrete and continuous random variables.
I think this is just a discrete-time random walk whose steps are exponentially distributed.



I think I can condition on $N$, something like
$$\text{Prob}(Y_N = y) = \sum_{n=0}^N \text{Prob}(N=n)\,\text{Prob}(X_1 + \dots + X_n = y),$$
and this will become

$$p(y) = \sum_{n=1}^N p_n f^{n*}(y),$$



but I'm not actually sure how to write the $n$-fold convolution of exponentials (although this should be a Gamma distribution I think).

So, first interjection: is it this?



$$ f^{n*}(y) = \int dx_1 \dots dx_n\, f(x_1) f(x_2-x_1) f(x_3-x_2-x_1) \dots f(y-x_n-\dots-x_1)\,? $$



or is it this?



$$ f^{n*}(y) = \int dx_1 \dots dx_n\, f(y-x_n) f(x_n-x_{n-1}) \dots f(x_2-x_1) f(x_1)\,? $$
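Whichever bookkeeping is right, the $n$-fold convolution should match the Gamma$(n,\lambda)$ density $\lambda^n y^{n-1} e^{-\lambda y}/(n-1)!$. A rough check by repeated discrete convolution on a grid (rate and grid values chosen for illustration):

```python
import math

lam, h, M = 1.5, 0.01, 1000                     # rate, grid step, grid size on [0, 10)
grid = [i * h for i in range(M)]
f1 = [lam * math.exp(-lam * y) for y in grid]   # exponential pdf on the grid

def convolve(a, b):
    # Riemann-sum approximation of (a*b)(y_k) = integral_0^{y_k} a(x) b(y_k - x) dx
    return [h * sum(a[j] * b[k - j] for j in range(k + 1)) for k in range(len(a))]

def gamma_pdf(n, y):
    # Gamma(n, rate=lam) density, the claimed n-fold convolution
    return lam**n * y**(n - 1) * math.exp(-lam * y) / math.factorial(n - 1)

f2 = convolve(f1, f1)   # numerical f^{2*}
f3 = convolve(f2, f1)   # numerical f^{3*}
```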



Then I know I can take a Laplace transform to simplify the convolution (although I'm not sure what the convolution should look like exactly yet), to get something like

$$ p(s) = \sum_{n=0}^N p_n f(s)^n. $$



Does this make sense? Then this is the probability generating function of $p_n$ evaluated at $f(s)$:

$$ G(z) = \langle z^n \rangle \big|_{z=f(s)} = \sum_{n=0}^N z^n p_n \Big|_{z=f(s)}. $$



So the mgf for $p(y)$ is the pgf for $p_n$ evaluated at the mgf for $f(x)$: is this correct?



(using this link to keep my words straight)



Following through, the pgf of the binomial distribution $p_n$ is

$$ G(z) = \sum_{i=0}^N {N \choose i} (zp)^i q^{N-i} = (zp + q)^N, $$



(verified correct here)
and the mgf of the exponential distribution $f(x)$ is

$$ f(s) = \int_0^\infty dx\, \lambda e^{(s-\lambda)x} = \frac{\lambda}{\lambda - s} \qquad (s < \lambda) $$



(verified correct here).
So the mgf for $p(y)$ will be

$$ p(s) = \left(p\,\frac{\lambda}{\lambda-s} + q\right)^N. $$



So for example the mean value of $Y$ should be

$$ \mu_Y = \frac{d}{ds}\left(p\frac{\lambda}{\lambda-s}+q\right)^N \bigg|_{s=0} = N \left(p\frac{\lambda}{\lambda-s}+q\right)^{N-1} p\,\frac{\lambda}{(\lambda-s)^2} \bigg|_{s=0} = Np\lambda^{-1}. $$
(Fixed an error thanks to Clement)
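The mean can also be checked against the mgf numerically with a central difference (again with illustrative parameter values):

```python
N, p, lam = 10, 0.3, 2.0
q = 1 - p

def mgf(s):
    # p(s) = (p * lam / (lam - s) + q)**N, valid for s < lam
    return (p * lam / (lam - s) + q) ** N

eps = 1e-6
mean_numeric = (mgf(eps) - mgf(-eps)) / (2 * eps)   # d/ds p(s) at s = 0
mean_formula = N * p / lam                          # N p / lambda
```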



The probability distribution should be the inverse Fourier transform of the characteristic function $p(s=ik)$:

$$ p(y) = \int_{-\infty}^{\infty} \frac{dk}{2\pi}\, e^{-iky} \left(\frac{p\lambda}{\lambda-ik} + q\right)^N. $$



Any thoughts on this integral are appreciated!
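As a numerical sanity check (not a solution): I believe expanding the binomial lets each term invert separately, leaving an atom of mass $q^N$ at $y=0$ plus a finite mixture of Gamma$(n,\lambda)$ densities for $y>0$. A sketch under that assumption, verifying that the candidate density has total mass $1$ and the expected mean (parameter values are illustrative):

```python
import math

N, p, lam = 10, 0.3, 2.0
q = 1 - p

# Binomial weights C(N, n) p^n q^(N-n)
weights = [math.comb(N, n) * p**n * q**(N - n) for n in range(N + 1)]

def density(y):
    """Candidate continuous part of p(y): mixture of Gamma(n, rate=lam) pdfs, n = 1..N."""
    return sum(
        weights[n] * lam**n * y**(n - 1) * math.exp(-lam * y) / math.factorial(n - 1)
        for n in range(1, N + 1)
    )

h = 0.001
ys = [i * h for i in range(1, 40_000)]           # integrate over (0, 40]; tail is negligible
mass = q**N + h * sum(density(y) for y in ys)    # atom + continuous mass, should be 1
mean = h * sum(y * density(y) for y in ys)       # should equal N * p / lam
```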










  • This is correct. This is the simplest way: find the mgf of $Y_N$ from the pgf of $N$.
    – NCh, Dec 14 '18 at 1:24

  • @NCh can you comment on the proper way for me to write the convolution integral of $N$ distributions? And any idea why I get a negative mean value for $Y$?
    – kevinkayaks, Dec 14 '18 at 1:49

  • Oh I see that now! I missed a minus. Thanks @ClementC.
    – kevinkayaks, Dec 14 '18 at 1:55

  • You can check that the expectation is correct: $$\mathbb{E}[Y_N] = \mathbb{E}[\mathbb{E}[Y_N \mid N]] = \mathbb{E}\Big[\mathbb{E}\Big[\sum_{k=1}^N X_k \,\Big|\, N\Big]\Big] = \mathbb{E}\Big[N \cdot \frac{1}{\lambda}\Big] = \frac{Np}{\lambda}$$
    – Clement C., Dec 14 '18 at 1:59

  • @kevinkayaks $f^{n*}(y) = \iint f(x_1) f(x_2) \ldots f(x_{n-1})\, f(y - x_1 - x_2 - \ldots - x_{n-1})\, dx_1\, dx_2 \ldots dx_{n-1}$. And you can prove that the sum of i.i.d. exponential r.v.'s is Gamma distributed without the $n$-fold convolution, only from the convolution of two pdfs, by induction.
    – NCh, Dec 14 '18 at 12:11
















probability stochastic-processes random-walk

asked Dec 14 '18 at 0:58 by kevinkayaks
edited Dec 14 '18 at 19:23 by kevinkayaks







