Specifically, I’m asking about this situation that came up during a D&D game:
Say I’m rolling a 20-sided die, and I do not want to roll a 1. I know that the odds of rolling a 1 are 1/20.
I know that the chance of rolling a 1 twice in a row is (1/20 * 1/20) = 1/400, which is far less likely.
Say, then, that before my “real” roll I rolled the die again and again until I landed on a 1, and then made my “real” roll. Would I have reduced the odds of rolling a 1 on that roll to (1/20 * 1/20), given that I’ve just rolled a 1?
This is the logic I’m having trouble reasoning about and I’d appreciate it if anyone could clarify what is or is not accurate about the assumptions being made in this scenario.
In: Mathematics
Each individual roll stands on its own. So each time you pick up a 20-sided die, you have a 1 in 20 chance of rolling any given number.
Before you start rolling, the chance of rolling a particular number two times in a row is pretty small: (1/20 * 1/20) = 1/400. But even if you rolled twelve 1s in a row, the next time you pick up your die, you still have a 1 in 20 chance in that moment. The past doesn’t alter the odds on your specific roll.
It’s just that the odds of rolling thirteen 1s in a row are incredibly small before you’ve rolled at all. But after you’ve already rolled twelve 1s in a row? The odds that the next roll is a 1 are still 1 in 20.
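A quick simulation makes this concrete. The sketch below (the trial count and seed are arbitrary choices, not anything from the question) estimates two things: the plain chance that a d20 roll is a 1, and the chance that a roll is a 1 *given the previous roll was a 1*. Because the rolls are independent, both estimates come out near 1/20 = 0.05.

```python
import random

random.seed(0)  # arbitrary seed so the run is repeatable
trials = 200_000

# Estimate the unconditional probability that a d20 roll is a 1.
# Expected to be close to 1/20 = 0.05.
ones = sum(1 for _ in range(trials) if random.randint(1, 20) == 1)
print("P(roll is 1)              ~", ones / trials)

# Estimate the probability that a roll is a 1 *immediately after*
# a roll that was a 1. Independence says this should also be ~0.05.
after_one = 0        # rolls of 1 that followed a 1
total_after_one = 0  # rolls that followed a 1
prev = random.randint(1, 20)
for _ in range(trials):
    cur = random.randint(1, 20)
    if prev == 1:
        total_after_one += 1
        if cur == 1:
            after_one += 1
    prev = cur
print("P(roll is 1 | prev was 1) ~", after_one / total_after_one)
```

Both printed estimates land near 0.05: rolling a 1 first does nothing to push the next roll toward 1/400.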