A probability inequality: the probability that the normalized sum of i.i.d. random variables is large is bounded below
I have been told that the following fact is true. Let $X_1, X_2, X_3, \dots$ be i.i.d. random variables. Then there exist $\epsilon, \delta > 0$ such that, for all $n$,
$$\mathbb{P}\left(\frac{|X_1+\dots+X_n|}{\sqrt{n}} \geq \epsilon\right) \geq \delta.$$
Observe that $\epsilon$ does not depend on $n$; both $\epsilon$ and $\delta$ are $>0$.
However, I am struggling to prove this or to find any reference. The kicker is that the $X_i$'s need not have finite mean or variance. In fact, I want to apply this "fact" in a situation where the $X_i$'s must have infinite variance (but possibly zero mean), so elementary tools like Markov's or Chebyshev's inequality won't help. I am unsure how to proceed. Any hint would be greatly appreciated!
Update on progress: For the $X_i$ that I am interested in, I have deduced the condition
$$X_1 + X_2 + \dots + X_{2^k} \sim 2^{k/4} X_1.$$
I have also proved the inequality
$$\mathbb{P}(|S_n| > t) \geq \frac{1}{2}\,\mathbb{P}\Big(\max_j |X_j| > t\Big) \geq \frac{1}{2}\left(1 - e^{-n(1-F(t)+F(-t))}\right),$$
where $F$ is the c.d.f. of $X_i$ and $S_n = X_1 + \dots + X_n$. Thus it seems the issue boils down to analyzing the distribution of $X_i$.
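As a sanity check (not part of the argument), the chain of inequalities above can be spot-checked numerically for a symmetric heavy-tailed law. The choice of the standard Cauchy distribution here is my own: it is symmetric, so the first inequality holds by the classical Lévy-type bound, it has infinite variance, and its c.d.f. gives $1 - F(t) + F(-t) = 1 - \frac{2}{\pi}\arctan t$ in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)
n, t, trials = 50, 3.0, 200_000

# i.i.d. standard Cauchy samples: symmetric, infinite variance
X = rng.standard_cauchy((trials, n))
S = X.sum(axis=1)

lhs = np.mean(np.abs(S) > t)            # Monte Carlo estimate of P(|S_n| > t)
tail = 1 - (2 / np.pi) * np.arctan(t)   # 1 - F(t) + F(-t) for the Cauchy cdf
rhs = 0.5 * (1 - np.exp(-n * tail))     # the claimed lower bound

print(lhs >= rhs)  # expect True
```

For these parameters the estimated left-hand side is far above the bound (roughly $0.96$ versus $0.5$), consistent with the fact that $S_n$ is again Cauchy-distributed with scale $n$.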
Tags: real-analysis, probability, probability-theory, statistics, probability-limit-theorems
– VHarisop (Dec 4 '18 at 23:41): This obviously fails if $\mathbb{P}(X_i = 0) = 1$. Do you have any other assumptions on $X_i$? You're trying to prove an instance of something that resembles "small-ball probabilities", if that helps you look for relevant references.
– Steve L (Dec 4 '18 at 23:56): We have that $X_i$ is nontrivial; I should have added that.
asked Dec 4 '18 at 22:59 by Steve L; edited Dec 5 '18 at 4:19
2 Answers
Let me give a direct proof using characteristic functions. The setting is as follows:
$(X_n)$ and $(X'_n)$ are i.i.d.;
$\tilde{X}_n = X_n - X'_n$ are the symmetrized variables;
$S_n = X_1 + \cdots + X_n$ and $\tilde{S}_n = \tilde{X}_1 + \cdots + \tilde{X}_n$.
Under this setting, we want to prove:
Claim. If the law of $X_1$ is not degenerate, then there exists $\epsilon > 0$ such that
$$ \inf_{n \geq 1} \mathbb{P}\left( |S_n| \geq \epsilon\sqrt{n} \right) > 0. $$
We prove the contrapositive. To this end, assume that $\inf_n \mathbb{P}\left(|S_n| \geq \epsilon\sqrt{n}\right) = 0$ for every $\epsilon > 0$. Then there exists a subsequence $(n_k)$ such that $S_{n_k}/\sqrt{n_k} \to 0$ in probability. This implies that $\tilde{S}_{n_k}/\sqrt{n_k} \to 0$ in probability as well. So, if $\varphi(t) = \mathbb{E}[\cos(t\tilde{X}_1)]$ denotes the (real-valued, by symmetry) characteristic function of $\tilde{X}_1$, then
$$ \varphi\left( \frac{t}{\sqrt{n_k}} \right)^{n_k} = \mathbb{E}\left[\exp\left(\mathrm{i}t\,\tilde{S}_{n_k}/\sqrt{n_k}\right)\right] \xrightarrow[k\to\infty]{} 1 $$
by the Portmanteau theorem. Taking $\log|\cdot|$, we get $n_k \log\left| \varphi\left( \frac{t}{\sqrt{n_k}} \right) \right| \to 0$. But since
$\varphi\left( \frac{t}{\sqrt{n_k}} \right) = 1 - 2\,\mathbb{E}\left[ \sin^2\left( \frac{t\tilde{X}_1}{2\sqrt{n_k}}\right) \right]$ by the double-angle identity, and
$\mathbb{E}\left[ \sin^2\left( \frac{t\tilde{X}_1}{2\sqrt{n_k}}\right) \right] \to 0$ by the dominated convergence theorem,
it follows that
$$ n_k\, \mathbb{E}\left[ \sin^2\left( \frac{t\tilde{X}_1}{2\sqrt{n_k}}\right) \right] \xrightarrow[k\to\infty]{} 0. $$
Plugging in $t = 2$ and applying the monotone convergence theorem and the squeeze lemma,
$$ \mathbb{E}[\tilde{X}_1^2]
= \lim_{k\to\infty} \mathbb{E}\left[ n_k \sin^2\left( \frac{\tilde{X}_1}{\sqrt{n_k}}\right) \mathbf{1}_{\left\{ |\tilde{X}_1| \leq \frac{\pi}{2}\sqrt{n_k} \right\}} \right]
= 0, $$
and therefore $X_1$ is degenerate.
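The claim itself can be illustrated numerically (this is an illustration of my own choosing, not part of the proof): for the standard Cauchy law, which is non-degenerate with infinite variance, $\mathbb{P}(|S_n| \geq \epsilon\sqrt{n})$ stays bounded away from $0$ uniformly in $n$. In fact $S_n/n$ is again standard Cauchy, so this probability even tends to $1$.

```python
import numpy as np

rng = np.random.default_rng(1)
eps, trials = 1.0, 50_000

# Estimate P(|S_n| >= eps * sqrt(n)) for several n; the claim says the
# infimum over n is positive for any non-degenerate law.
probs = {}
for n in (10, 100, 1000):
    S = rng.standard_cauchy((trials, n)).sum(axis=1)
    probs[n] = np.mean(np.abs(S) >= eps * np.sqrt(n))

print(min(probs.values()) > 0.5)  # expect True for the Cauchy case
```

Here the estimates actually increase toward $1$ with $n$, since $|S_n|$ grows like $n$ rather than $\sqrt{n}$ for Cauchy summands.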
– maridia (Dec 5 '18 at 16:08): Nice! I was looking for a direct characteristic-function approach, but got stuck since we can't use a Taylor expansion (no moments given). I'll have to remember this double-angle trick.
– Steve L (Dec 5 '18 at 19:00): Cheeky trick, thank you!
I think what we want to show is that $S_n/\sqrt{n}$ does not converge in probability to $0$.
If the $X_i$ have finite mean and positive variance, the result follows from the CLT, so we only have to handle the case of infinite variance. In some sense this should be even easier, since it is more likely that the sum $S_n = X_1 + \dots + X_n$ is large. In fact, if $S_n/\sqrt{n}$ converges in probability to $0$, then the $X_i$ must have finite variance; this was an exercise in Durrett's probability book. The idea is to symmetrize by considering the random variables $Y_i = X_i - X'_i$. Assume the $X_i$ have infinite variance. Then we can consider truncated versions of $Y_i$ with arbitrarily large finite variance. This allows us to get a bound like $\mathbb{P}\left(\sum Y_i \geq K\sqrt{n}\right) \geq 1/5$ for arbitrary $K$. (Essentially, if that probability were too small, there would be no way to obtain the required large variance.) But the probability was supposed to go to $0$ for every $K > 0$. Thus the $X_i$ can be assumed to have finite variance, and the CLT applies.
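A numerical sketch of the truncation step (the distribution below is a stand-in of my own, and clipping is used as a convenient form of truncation): for a symmetric Pareto-type law with tail index $\alpha = 3/2 < 2$, the symmetrized variables $Y_i$ have infinite variance, so truncating them at larger and larger levels $M$ produces arbitrarily large finite variance, which is what drives the contradiction in the argument above.

```python
import numpy as np

rng = np.random.default_rng(2)
trials, alpha = 500_000, 1.5  # tail index < 2  =>  infinite variance

def sym_pareto(size):
    # symmetric heavy-tailed variable: random sign times Pareto(alpha) on [1, inf)
    return rng.choice([-1.0, 1.0], size) * (rng.pareto(alpha, size) + 1.0)

Y = sym_pareto(trials) - sym_pareto(trials)  # symmetrized Y_i = X_i - X'_i

# Variance of the truncated (here: clipped) variable grows without bound in M;
# for alpha = 1.5 it scales roughly like sqrt(M).
trunc_var = {M: np.var(np.clip(Y, -M, M)) for M in (10, 100, 1000)}
print(trunc_var[10] < trunc_var[100] < trunc_var[1000])  # expect True
```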
answered Dec 5 '18 at 9:54 by Sangchul Lee
answered Dec 5 '18 at 7:32 by maridia