I was thinking about lottery odds, and how so much of the pitch is, essentially, “you miss 100 percent of the shots you don’t take”: the idea being that you should at least enter because your odds go up so much with just one ticket. The odds were nonexistent before, and now they exist, even if they’re vanishingly small.
Is the difference between 1 in a million and 0 in a million actually somehow more than the difference between 1 in a million and 2 in a million, or between 492,368 in a million vs 492,369 in a million? Or are all three of these functionally the same?
>What’s the difference between something with 1 in a million odds and 0 in a million odds?
0 in a million: impossible to win.
1 in a million: you can win, but only rarely.
>Is the difference between 1 in a million and 0 in a million actually somehow more than the difference between 1 in a million and 2 in a million, or between 492,368 in a million vs 492,369 in a million? Or are all three of these functionally the same?
They are different.
The best way to look at these things is in terms of their proportions.
Going from 1 in a million to 2 in a million is 100% better odds (2 times the odds).
Going from 492,368 in a million to 492,369 in a million is about 0.0002% better odds (basically no change).
Going from 0 in a million to 1 in a million is an incalculable increase in odds (in some senses, more than infinity times better).
0.1 in a million to 1 in a million is 10 times the odds.
0.01 in a million to 1 in a million is 100 times the odds.
0.000001 in a million to 1 in a million is 1000000 times the odds.
Now, as the 0.000…001 gets closer to 0, the factor by which the odds improve (compared to 1 in a million) grows closer and closer to infinity.
However, 0 times infinity is usually undefined, so you can’t multiply 0 by infinity to get 1. (This is why I say that, in some senses, going from 0 to 1 in a million is better than an infinite improvement in odds.)
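If you want to see those ratios computed directly, here’s a minimal Python sketch (the odds_ratio helper is just for illustration, not from any library):

```python
def odds_ratio(old_per_million, new_per_million):
    # How many times better the new odds are compared to the old.
    # The "per million" denominator cancels out, so we just divide.
    # Returns None when the old odds are exactly zero: no finite
    # multiplier turns 0 in a million into 1 in a million.
    if old_per_million == 0:
        return None
    return new_per_million / old_per_million

print(odds_ratio(1, 2))              # 2.0  (twice the odds)
print(odds_ratio(492_368, 492_369))  # 1.000002...  (basically unchanged)
print(odds_ratio(0.1, 1))            # 10.0
print(odds_ratio(0.01, 1))           # 100.0
print(odds_ratio(0.000001, 1))       # ~1,000,000 (up to float rounding)
print(odds_ratio(0, 1))              # None: the ratio is undefined
```

The last line is the whole point: every nonzero starting chance gives some finite “times better” number, but starting from exactly 0 gives no number at all.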