Convergence of $\lim_{N\to\infty}\sum_{n=1}^N \exp(-N\sin^2(\frac{n\pi}{2N}))$ and $\lim_{N\to\infty}\sum_{n=1}^N...$
I can't find the right approach to tackle the question whether
$$\lim_{N\to\infty} \sum_{n=1}^N \exp\Bigg(-N \sin^2\left(\frac{n\pi}{2N}\right)\Bigg)$$
and
$$\lim_{N\to\infty} \sum_{n=1}^N \exp\Biggl(-\sin^2\left(\frac{n\pi}{2N}\right)\Biggr)$$
converge or diverge. The fact that the limiting variable appears both as the upper bound of summation and in the individual summands seems to make the standard methods known to me inapplicable.
I suspect that the second limit (i.e. the one not containing $N$ in the exponent directly) does not exist, but that the first one may. I would be very grateful if you could point me to methods that allow one to determine the existence of the limits.
If a limit exists, I would also be very interested in understanding how, if at all, one could (approximately) replace the sum with an integral.
(For background, these questions arose during my study of the Rouse theory of polymer dynamics, e.g. in Section 7.3.2 of Doi and Edwards, "The Theory of Polymer Dynamics". Physical explanations of how one can justify the treatment therein would be very welcome, too.)
Thank you in advance!
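(Editor's note, not part of the original question: a quick numerical sketch of how the two sums behave as $N$ grows; the function names are mine. This suggests both sums grow without bound, the first roughly like $\sqrt{N}$ and the second roughly like $N$.)

```python
import math

def first_sum(N):
    # S1(N) = sum_{n=1}^N exp(-N * sin^2(n*pi/(2N)))
    return sum(math.exp(-N * math.sin(n * math.pi / (2 * N)) ** 2)
               for n in range(1, N + 1))

def second_sum(N):
    # S2(N) = sum_{n=1}^N exp(-sin^2(n*pi/(2N)))
    return sum(math.exp(-math.sin(n * math.pi / (2 * N)) ** 2)
               for n in range(1, N + 1))

for N in (10, 100, 1000):
    print(N, first_sum(N), second_sum(N))
```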
sequences-and-series limits convergence summation exponential-sum
ru.wikipedia.org/wiki/Логарифмический_признак_сходимости
– Samvel Safaryan
Nov 21 at 13:35
@SamvelSafaryan: Thanks for the link. I'm afraid, though, that this method is not applicable, or is it? I assume the theorem holds only if the $a_n$ do not depend on the limiting variable $N$ (i.e. the maximum index of summation). Please correct me if I'm wrong, I might just have conceptual difficulties.
– Batista
Nov 21 at 13:55
The second sum diverges to $\infty$ because the same sum divided by $N$ tends to $\int_{0}^{1}e^{-\sin^2(\pi x/2)}\,dx>0$.
– Paramanand Singh
Nov 21 at 14:21
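(Editorial addition, not part of the thread: a numerical check of this comment. The scaled sum is a right-endpoint Riemann sum for the integral, and both evaluate to roughly $0.645$.)

```python
import math

def ratio(N):
    # (1/N) * sum_{n=1}^N exp(-sin^2(n*pi/(2N))): a right-endpoint
    # Riemann sum for the integral of exp(-sin^2(pi*x/2)) over [0, 1]
    return sum(math.exp(-math.sin(n * math.pi / (2 * N)) ** 2)
               for n in range(1, N + 1)) / N

def integral(steps=20000):
    # midpoint rule for the integral of exp(-sin^2(pi*x/2)) over [0, 1]
    h = 1.0 / steps
    return h * sum(math.exp(-math.sin(math.pi * (k + 0.5) * h / 2) ** 2)
                   for k in range(steps))

print(ratio(20000), integral())  # both ≈ 0.645
```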
@ParamanandSingh: That's an elegant argument, thanks!
– Batista
Nov 21 at 14:47
Replacing $e^{-N\cdots}$ with $e^{-N^2\cdots}$ gives a more interesting problem with the limit converging to a finite value.
– Winther
Nov 21 at 23:40
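(Editorial addition: numerically, the $N^2$ variant in this comment does appear to converge. For large $N$ only small $n$ contribute, and there $N\sin(n\pi/2N)\to n\pi/2$, so the limit should be $\sum_{n\ge 1}e^{-n^2\pi^2/4}\approx 0.0849$; the sketch below, with names of my own choosing, checks this.)

```python
import math

def winther_sum(N):
    # sum_{n=1}^N exp(-N^2 * sin^2(n*pi/(2N))); large-n terms
    # underflow harmlessly to 0.0 in math.exp
    return sum(math.exp(-(N * math.sin(n * math.pi / (2 * N))) ** 2)
               for n in range(1, N + 1))

# candidate limit: sum over n of exp(-(n*pi/2)^2), truncated
limit = sum(math.exp(-(n * math.pi / 2) ** 2) for n in range(1, 50))
print(winther_sum(1000), limit)  # both ≈ 0.0849
```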
edited Nov 21 at 16:58
asked Nov 21 at 12:29
Batista
185
2 Answers
Here's another way to see that the limit of the first series is $\infty.$ Verify that for each fixed $n,$
$$\lim_{N\to \infty}\exp\left(-N \sin^2\left(\frac{n\pi}{2N}\right)\right) = 1,$$
since $N \sin^2\left(\frac{n\pi}{2N}\right) \le N\left(\frac{n\pi}{2N}\right)^2 = \frac{n^2\pi^2}{4N} \to 0.$
Fix $N_0\in \mathbb{N}.$ Then for $N\ge N_0,$ the first series is at least
$$\sum_{n=1}^{N_0}\exp\left(-N \sin^2\left(\frac{n\pi}{2N}\right)\right).$$
This is a finite sum, so computing its limit as $N\to \infty$ is easy: we get $\sum_{n=1}^{N_0}1 = N_0.$ Since $N_0$ can be taken arbitrarily large, the limit of the first series is $\infty.$ Since the terms of the second series are at least as large as those of the first series, the limit of the second series is also $\infty.$
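(Editorial illustration of this argument, not part of the original answer: each fixed-$n$ term tends to $1$, and the truncated sum over $n \le N_0$ approaches $N_0$, so the full sum eventually exceeds any fixed bound.)

```python
import math

def term(n, N):
    # a single summand exp(-N * sin^2(n*pi/(2N)))
    return math.exp(-N * math.sin(n * math.pi / (2 * N)) ** 2)

# for fixed n, term(n, N) -> 1 as N grows
for N in (10, 100, 1000, 10000):
    print(N, term(1, N), term(5, N))

# the truncated sum over n = 1..N0 approaches N0 for large N
N0 = 20
print(sum(term(n, 10**6) for n in range(1, N0 + 1)))  # close to N0 = 20
```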
Thank you very much, that's a very elegant solution!
– Batista
Nov 22 at 16:28
Both sums diverge, and a general idea for proving it is to use Riemann sums, just as suggested in the comments. The second sum was proved divergent in the comments. For the first sum, notice that $0 \le \sin x \le x$ for $x \in [0, \pi/2]$ implies $0 \geq -\sin^2 x \geq -x^2$ in this range, and since the exponential function is increasing we have
\begin{align*}
\sum_{n=1}^N \exp\Bigg(-N \sin^2\left(\frac{n\pi}{2N}\right)\Bigg)
&\geq \sum_{n=1}^N \exp\Bigg(- \frac{n^2\pi^2}{4N}\Bigg) \\
&= \sqrt{N} \cdot \left[ \frac{1}{\sqrt{N}} \sum_{n=1}^N \exp\Bigg(- \frac{\pi^2}{4} \bigg(\frac{n}{\sqrt{N}}\bigg)^2 \Bigg)\right].
\end{align*}
Now, the expression inside the square brackets is close to the integral (you can compare them using the integral test, for instance)
$$
\int_0^{\sqrt{N}} e^{-\frac{\pi^2}{4} x^2}\, dx \to \int_0^\infty e^{-\frac{\pi^2}{4} x^2}\, dx < \infty \quad \text{as } N \to \infty.
$$
This implies the first sum grows at least like a positive multiple of $\sqrt{N}$ and must diverge.
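(Editorial note: numerically the scaled sum does not just stay bounded below but appears to converge. Since $\sin(n\pi/2N)\approx n\pi/2N$ for $n = o(N)$, one expects $S_N/\sqrt{N}\to\int_0^\infty e^{-\pi^2 x^2/4}\,dx = 1/\sqrt{\pi}\approx 0.564$, which the sketch below supports.)

```python
import math

def first_sum(N):
    # sum_{n=1}^N exp(-N * sin^2(n*pi/(2N)))
    return sum(math.exp(-N * math.sin(n * math.pi / (2 * N)) ** 2)
               for n in range(1, N + 1))

for N in (100, 1000, 10000, 100000):
    # ratio appears to tend to 1/sqrt(pi) ≈ 0.564
    print(N, first_sum(N) / math.sqrt(N))
```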
Thank you so much for your insightful answer! I wish I could accept both answers.
– Batista
Nov 22 at 16:22
edited Nov 21 at 23:28
answered Nov 21 at 21:02
zhw.
answered Nov 21 at 17:48
Daniel