The probability behind Bitcoin.
I'm trying to understand the end of the foundational Bitcoin paper, in which the author plays a bit with probability to show how his system works. I'll try to explain the technicalities of Bitcoin in exchange for some guidance on the probability calculations involved.
We consider the scenario of an attacker trying to generate an
alternate chain faster than the honest chain. The race between the
honest chain and an attacker chain can be characterized as a Binomial
Random Walk. The success event is the honest chain being extended by
one block, increasing its lead by +1, and the failure event is the
attacker's chain being extended by one block, reducing the gap by -1.
The probability of an attacker catching up from a given deficit is
analogous to a Gambler's Ruin problem.
Here I'm assuming we are talking about a binomial distribution, not a negative binomial distribution. Some people, however, consider that it corresponds to a negative binomial. I'm also not aware of any special implications of saying that this is a random walk.
Suppose a gambler with unlimited credit starts at a deficit and plays
potentially an infinite number of trials to try to reach break-even.
We can calculate the probability he ever reaches break-even, or that
an attacker ever catches up with the honest chain, as follows:
p = probability an honest node finds the next block
q = probability the attacker finds the next block
$q_z$ = probability the attacker will
ever catch up from z blocks behind
$q_z = 1$ if $p \leq q$, and $q_z = \left(\frac{q}{p}\right)^z$ if $p > q$
Given our assumption that p > q, the probability drops exponentially
as the number of blocks the attacker has to catch up with increases.
With the odds against him, if he doesn't make a lucky lunge forward
early on, his chances become vanishingly small as he falls further
behind.
This all makes sense to me, and maybe here is where one could say that $q_z$ follows a negative binomial distribution. However, I would expect the number of failures in a negative binomial distribution to stay fixed, whereas in this situation every success increases the gap between the honest chain and the attacker's chain, so the attacker needs more successes to catch up. Maybe you can clarify this point.
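For concreteness, here is a small numerical sketch I put together (the value of $q$ is just an assumption for illustration) showing how quickly $q_z = (q/p)^z$ decays:

```python
# Numerical illustration (my own, not from the paper) of how the
# catch-up probability q_z = (q/p)^z drops with the deficit z.
q = 0.1            # attacker's share of the hashing power (assumed)
p = 1.0 - q        # honest network's share

for z in range(11):
    q_z = (q / p) ** z if p > q else 1.0
    print(f"z = {z:2d}   q_z = {q_z:.3e}")
```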
We now consider how long the recipient of a new transaction needs to
wait before being sufficiently certain the sender can't change the
transaction. We assume the sender is an attacker who wants to make the
recipient believe he paid him for a while, then switch it to pay back
to himself after some time has passed. The receiver will be alerted
when that happens, but the sender hopes it will be too late.
The receiver generates a new key pair and gives the public key to the
sender shortly before signing. This prevents the sender from preparing
a chain of blocks ahead of time by working on it continuously until he
is lucky enough to get far enough ahead, then executing the
transaction at that moment. Once the transaction is sent, the
dishonest sender starts working in secret on a parallel chain
containing an alternate version of his transaction.
The recipient waits until the transaction has been added to a block
and z blocks have been linked after it. He doesn't know the exact
amount of progress the attacker has made, but assuming the honest
blocks took the average expected time per block, the attacker's
potential progress will be a Poisson distribution with expected value $\lambda = z\frac{q}{p}$.
OK, so here I definitely think we are using the fact that the negative binomial or the binomial distribution converges to the Poisson distribution in a limit; summarizing: $BN(n,p) \rightarrow_{n \rightarrow \infty,\ \lambda = n(1-p)} \mathrm{Poisson}(\lambda)$ and $B(n,p) \rightarrow_{n \rightarrow \infty,\ \lambda = np} \mathrm{Poisson}(\lambda)$. Looking at the previous formulas I suspect the author is going for the binomial one, but I would still need some clarification.
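One way I tried to check numerically which approximation is in play (this is my own sketch, with parameters chosen only for illustration) was to compare the negative binomial pmf for the attacker's progress against a $\mathrm{Poisson}(z\frac{q}{p})$ pmf; they only agree closely when $q$ is small:

```python
# Rough check (my own sketch) of when Poisson(z*q/p) approximates the
# negative binomial number of attacker blocks found before z honest blocks.
from math import comb, exp, factorial

def nb_pmf(k, z, q):
    """P(attacker has k blocks when the honest chain reaches z blocks)."""
    p = 1.0 - q
    return comb(k + z - 1, k) * q**k * p**z

def poisson_pmf(k, lam):
    return lam**k * exp(-lam) / factorial(k)

for q in (0.01, 0.1, 0.3):
    z = 6
    lam = z * q / (1.0 - q)
    diff = max(abs(nb_pmf(k, z, q) - poisson_pmf(k, lam)) for k in range(60))
    print(f"q = {q:4.2f}, z = {z}: max pmf difference = {diff:.4f}")
```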
To get the probability the attacker could still catch up now, we
multiply the Poisson density for each amount of progress he could have
made by the probability he could catch up from that point: $$\sum_{k \geq 0} \frac{\lambda^k e^{-\lambda}}{k!} \cdot \begin{cases} \left(\frac{q}{p}\right)^{z-k} & \text{if } k \leq z \\ 1 & \text{if } k > z \end{cases}$$
Could you give me some insight to better understand this formula?
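To see what the sum does numerically, here is a short script I wrote along the lines of the C code in the paper's appendix (I'm not claiming it is the author's exact calculation); it folds the infinite tail over $k > z$ into $1$ minus the complementary finite sum:

```python
# Sketch (my own) of the attacker-success probability: the Poisson weight of
# each possible amount of attacker progress k, times the probability of
# catching up from z - k blocks behind. The infinite tail k > z is handled
# by subtracting the complementary terms from 1.
from math import exp

def attacker_success_probability(q, z):
    p = 1.0 - q
    lam = z * q / p
    prob = 1.0
    poisson = exp(-lam)          # Poisson pmf at k = 0
    for k in range(z + 1):
        if k > 0:
            poisson *= lam / k   # update to the pmf at k
        # subtract P(progress = k) * P(never catching up from z - k behind)
        prob -= poisson * (1.0 - (q / p) ** (z - k))
    return prob

for z in (0, 2, 4, 6, 10):
    print(f"q = 0.1, z = {z:2d}: P(catch up) = {attacker_success_probability(0.1, z):.6f}")
```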
What I ask for
So, in summary I'm asking you to:
1. State explicitly which random variable is being studied here, give its distribution, and say whether random walks are important for understanding the calculations I showed.
2. Depending on your answer to point one, describe which of the limits I showed is used to derive the Poisson distribution, and explain the idea behind the last quotation; I don't really get what is being computed there.
If you don't understand something about how Bitcoin works, please ask in a comment.
Tags: probability, probability-distributions
asked Jul 12 '17 at 23:15 by Javier
1 Answer
The probability being computed is the probability that the attacker will catch up. We can call that event $A$ so we are computing $P(A).$
The way we compute it is via the law of total probability, in other words, by splitting it up into cases. The cases chosen are how far along in their attack the attacker happens to be after $z$ blocks have been added. We let $B_k$ be the event that the attacker has made $k$ blocks of progress. We are assuming (more on this later) that $P(B_k) = \frac{1}{k!}\lambda^k e^{-\lambda}.$ We know that, given that the attacker has made $k$ blocks, the probability they catch up is (1) $1$ if they have already caught up (i.e. $k \ge z$), and (2) $(q/p)^{z-k}$ if they haven't caught up and are $z-k$ blocks behind.
Then we just decompose by the law of total probability: $$P(A) = \sum_k P(A\mid B_k)P(B_k)$$ where $P(B_k)=\frac{1}{k!}\lambda^k e^{-\lambda}$, $P(A\mid B_k) = (q/p)^{z-k}$ for $k<z$, and $P(A\mid B_k) = 1$ for $k\ge z.$
So the probability they catch up is the probability they haven't got any blocks yet and they catch up, plus the probability that they've made one block and they catch up, plus the probability they've made two blocks and they catch up, and so on. Note that as $k$ increases, $P(A\mid B_k)$ (the probability they catch up given that they have made $k$ blocks) increases, eventually becoming $1$, while $P(B_k)$ (the probability they've made $k$ blocks when the honest nodes have made $z$) decreases.
Now for the part about binomials. We can consider the honest network and the attacker to be independent Poisson processes, producing blocks at rates $\lambda_p$ and $\lambda_q$, so that the probability the next block is produced by the honest network is $p=\frac{\lambda_p}{\lambda_p+\lambda_q}$ and similarly $q=1-p$ for the attacker. In this case the distribution of the number of blocks $k$ the attacker has produced is negative binomial, because it is the number of successes for the attacker before $z$ failures (where a failure is the honest network producing a block). This seems to me a reasonable model of what's going on.
So we should instead have $$P(B_k) = \binom{k+z-1}{k}\, q^k(1-q)^z.$$
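Here's a quick Monte Carlo check (my own sketch, with arbitrary illustrative rates) that this is the right distribution: simulate the two independent Poisson processes and count the attacker's blocks at the moment the honest chain reaches $z$ blocks.

```python
# Monte Carlo check (illustrative rates, not from the paper): with two
# independent Poisson processes, the attacker's block count when the honest
# chain reaches z blocks is negative binomial with parameters z and q.
import random
from math import comb

lambda_p, lambda_q, z = 1.0, 0.25, 6
q = lambda_q / (lambda_p + lambda_q)
trials = 200_000

counts = {}
for _ in range(trials):
    # time for the honest chain to find z blocks: sum of z exponentials
    t = sum(random.expovariate(lambda_p) for _ in range(z))
    # attacker's progress in that time: count exponential inter-arrivals that fit
    k, elapsed = 0, random.expovariate(lambda_q)
    while elapsed < t:
        k += 1
        elapsed += random.expovariate(lambda_q)
    counts[k] = counts.get(k, 0) + 1

for k in range(6):
    empirical = counts.get(k, 0) / trials
    exact = comb(k + z - 1, k) * q**k * (1 - q)**z
    print(f"k = {k}: simulated {empirical:.4f}, negative binomial {exact:.4f}")
```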
Now, it sometimes happens that a negative binomial (or a binomial) can be approximated by a Poisson distribution. When is this true? In general, it's when you have a lot of attempts, each with a small probability. That would be the case when $z$ is large and $q$ is very small, which doesn't seem like a particularly natural limit for this problem, since we'd imagine moderate-sized $z$ and $q$ being of interest. An answerer Bob at the other site says something to the contrary, but they're confusing the trials of the nonce with the trials to get the next block. The former process actually is a (not negative) binomial, which is well approximated by a Poisson process.
What Satoshi does is assume that the honest blocks take exactly their average time $$T_z = z/\lambda_p$$ to get the $z$ blocks. Then, since we're still assuming the attacker generates blocks in a Poisson process, the number of blocks the attacker creates in that time is Poisson distributed with mean $$\lambda_q T_z = \frac{\lambda_q z}{\lambda_p} = \frac{q}{p}z.$$
But it seems a questionable approximation to suddenly have the honest nodes be deterministic while the attacker still has fluctuations. Of course, there will be a limit where this is effectively true, and it's exactly the same limit in which the negative binomial above can be approximated by a Poisson.
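To put a number on how much the approximation matters, here is a small comparison (again just a sketch, with assumed values of $q$ and $z$) of the catch-up probability computed with the exact negative binomial weights versus the Poisson approximation used in the whitepaper:

```python
# Compare the catch-up probability using negative binomial weights for the
# attacker's progress (exact under the two-Poisson-process model) against
# the Poisson(z*q/p) approximation used in the whitepaper.
from math import comb, exp

def catch_up_negative_binomial(q, z):
    p = 1.0 - q
    total, cdf = 0.0, 0.0
    for k in range(z):                        # k < z: still z - k blocks behind
        w = comb(k + z - 1, k) * q**k * p**z
        cdf += w
        total += w * (q / p) ** (z - k)
    return total + (1.0 - cdf)                # k >= z: already caught up

def catch_up_poisson(q, z):
    p = 1.0 - q
    lam = z * q / p
    prob, poisson = 1.0, exp(-lam)
    for k in range(z + 1):
        if k > 0:
            poisson *= lam / k
        prob -= poisson * (1.0 - (q / p) ** (z - k))
    return prob

for q in (0.1, 0.3):
    for z in (2, 6, 10):
        print(f"q={q}, z={z}: negative binomial {catch_up_negative_binomial(q, z):.4f}, "
              f"Poisson approx {catch_up_poisson(q, z):.4f}")
```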
answered Jul 13 '17 at 0:43 by spaceisdarkgreen