When calculating averages, why can we treat exploding die as if they're independent?
I'm certain that I'm forgetting something basic, but here goes. An exploding die is a (house) rule in some games: when you roll the maximum result on a given die (e.g. 6 on a six-sided die), you roll that die again and add the result. If you roll the maximum value again, you roll again and add that, and so on until you stop rolling the maximum value.
From here, a natural question arises: "what's the average of an exploding die?". With the example of a six-sided die, the following answer comes naturally:
$3.5+3.5\cdot\frac{1}{6}+3.5\cdot\frac{1}{6^2}+\dots=4.2$
This does indeed seem to be correct, and it holds up to every empirical test that I can think of, but why does it work? I want to appeal to something like "expected value is linear and we've got identical distributions", but I find that unsatisfactory. In particular, I don't understand why we can use the average value of 3.5 when every term to the right of that 3.5 assumes that we've beaten the average. I have no doubt that this is why we need the $6^{-n}$ terms, but my intuition insists that this is insufficient.
Note: What I really want here is to see the rigor. An ideal answer will attack this from the ground up, possibly even axiomatically. I'd hope that we don't have to go as deep as using probability measures on sets, but at the very least I want some answer that focuses on what property of averages allows us to factor the dice like this.
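For reference, the empirical side is easy to reproduce by simulation. Below is a minimal Monte Carlo sketch (Python; the function name, seed, and trial count are illustrative choices, not part of the original question):

```python
# Minimal Monte Carlo check of the claimed 4.2 average for an exploding d6.
import random

def roll_exploding_d6(rng):
    """Roll one exploding six-sided die: keep re-rolling and adding while a 6 comes up."""
    total = 0
    while True:
        roll = rng.randint(1, 6)
        total += roll
        if roll != 6:
            return total

rng = random.Random(0)      # fixed seed so the run is reproducible
trials = 1_000_000
mean = sum(roll_exploding_d6(rng) for _ in range(trials)) / trials
print(mean)                 # typically prints a value close to 4.2
```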
Tags: probability, dice, average
asked 8 hours ago by J. Mini
3 Answers
This does indeed come from linearity of expectation - but you have to be really careful about what exactly you're applying this theorem to. In particular, let's examine some random variables. In a given trial (i.e. you roll the die until you get something other than $6$), let us define some quantities. First, let $X$ be the total achieved. Let $X_1$ be the portion of this due to the first roll, $X_2$ the portion due to the second roll (which is $0$ if there was no second roll), and so on.
We then have that $X=X_1+X_2+X_3+\ldots$, noting that, almost surely, there are only finitely many non-zero terms in the sum and also - in case we should later worry about issues of convergence - that these are all non-negative quantities, so we are justified in applying linearity of expectation to get
$$\mathbb E[X]=\mathbb E[X_1]+\mathbb E[X_2]+\ldots$$
Then, we just compute $\mathbb E[X_n]$. This is straightforward: there is a $\frac{1}{6^{n-1}}$ chance that we will roll for an $n^{th}$ time and, given that we do roll, the expected roll is $3.5$, as it is just a typical die roll. So $\mathbb E[X_n]=\frac{1}{6^{n-1}}\cdot 3.5$, as given in the solution, which gives
$$\mathbb E[X]=(1+1/6+1/6^2+1/6^3+\ldots)\cdot 3.5$$
Note that, via this approach, we never consider whether a die actually was a $6$ except in determining whether we reach the $n^{th}$ roll - that's because, to compute the expectation, we are splitting into the cases "I roll this die" and "I don't roll this die", which do not bias the roll of the die at all. Basically, while computing each expectation, we are allowed to imagine that this is the last roll, regardless of whether we get a $6$, because no further information is relevant to the value of $X_n$.
answered 7 hours ago by Milo Brandt
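The decomposition described in this answer is easy to verify exactly; here is a small sketch with exact fractions (the helper name is an illustrative choice, not from the answer):

```python
# Exact check that E[X_n] = 3.5 / 6^(n-1) and that the partial sums tend to 21/5 = 4.2.
from fractions import Fraction

def expected_component(n):
    """E[X_n]: probability 1/6^(n-1) of reaching an n-th roll, times the mean roll 7/2."""
    return Fraction(1, 6) ** (n - 1) * Fraction(7, 2)

partial = Fraction(0)
for n in range(1, 11):
    partial += expected_component(n)
    print(n, partial, float(partial))
# The partial sums approach 21/5 = 4.2 rapidly as n grows.
```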
You've got it! The key part here is what "the portion due to the second roll" means. When you think about the probability distribution, you can see that $X_2$ is $1$ with probability $1/36$, $2$ with probability $1/36$, and so on, until you see that it's $0$ with probability $30/36$ (i.e. $5/6$). From here, the average of each $X_n$ is easy to work out, and it really just is linearity of expectation.
– J. Mini
7 hours ago
Another way to look at it:
Let $E$ denote the answer. Suppose you toss the die once. One of two things happens: either you get a value below $6$, or you get a $6$ and start over (from which point, of course, you expect to get an additional $E$). Thus we have $$E=\frac 16\times (1+2+3+4+5)+\frac 16\times (6+E)\implies E=\frac {21}5$$
as desired.
answered 8 hours ago by lulu
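Making the algebra explicit: the equation rearranges to $E-\frac E6=\frac{21}6$, so $E=\frac{21}5=4.2$. The fixed point is also easy to confirm mechanically; a small sketch with exact fractions (names are illustrative):

```python
# Check the one-step equation E = (1/6)(1+2+3+4+5) + (1/6)(6+E) for the exploding-d6 mean.
from fractions import Fraction

def step(E):
    """One application of the recursion: condition on the value of the first roll."""
    return Fraction(1, 6) * (1 + 2 + 3 + 4 + 5) + Fraction(1, 6) * (6 + E)

# Rearranging gives E*(1 - 1/6) = 21/6, hence E = (21/6) / (5/6) = 21/5.
E = Fraction(21, 6) / (1 - Fraction(1, 6))
print(E, float(E))       # 21/5 4.2
print(step(E) == E)      # True: 21/5 is indeed the fixed point of the recursion
```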
The series you have there represents
The expected value of the first die throw, plus the probability that you get a second throw times the expected value of the second die, plus the probability that you get a third throw times the expected value of the third throw, plus ...
It is basically what you get if you write down what the expectation is straight from the definition, and tidy up a little:
$$
\frac16\cdot1+\cdots+\frac16\cdot 5+\frac16\cdot \left(6+ \frac16\cdot1+\cdots+\frac16\cdot 5+\frac16\cdot \left(6+ \cdots \right) \right)
$$
answered 7 hours ago by Arthur (edited 7 hours ago)
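One way to see this nested expression converging is to evaluate truncations of it directly; a minimal sketch (the truncation depths and names are my own choices):

```python
# Evaluate truncations of the nested expectation written straight from the definition:
#   (1/6)*1 + ... + (1/6)*5 + (1/6)*(6 + <the same expression again>)
from fractions import Fraction

def nested_expectation(depth):
    """Evaluate the nested expression, cutting off the inner copy after `depth` levels."""
    if depth == 0:
        return Fraction(0)
    inner = nested_expectation(depth - 1)
    return sum(Fraction(1, 6) * k for k in range(1, 6)) + Fraction(1, 6) * (6 + inner)

for depth in (1, 2, 3, 5, 10):
    print(depth, float(nested_expectation(depth)))
# The truncated values increase toward 21/5 = 4.2 as the depth grows.
```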