I was thinking about lottery odds, and how so much of the pitch is, essentially, "you miss 100 percent of the shots you don't take" — the idea being that you should at least enter because your odds go up so much with just one ticket. The odds were nonexistent before, and now they exist, even if they're vanishingly small.
Is the difference between 1 in a million and 0 in a million actually somehow more than the difference between 1 in a million and 2 in a million, or between 492,368 in a million vs 492,369 in a million? Or are all three of these functionally the same?
Odds like these can generally be thought of in two ways: proportionally or absolutely.
Yes, absolutely speaking, 1 in a million is functionally the same as 0 in a million. While there is a distinct difference between the two, especially since it actually *could* happen in the 1 case, you’re *very* unlikely to notice any difference in a practical sense.
Proportionally, going from 1 in a million to 0 in a million is a decrease of 100%. It is *infinitely* less likely, comparatively. Meanwhile, stepping up to 2 in a million is an increase of 100%. 3 in a million is a 200% increase from 1, and a 50% increase from 2.
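The proportional comparisons above can be checked with a quick sketch. The `pct_change` helper here is hypothetical, just a way to make the arithmetic concrete:

```python
def pct_change(old, new):
    """Relative (proportional) change from old to new, as a percent.

    Going from 0 up to anything is an infinite proportional
    increase, so that case returns infinity.
    """
    if old == 0:
        return float("inf") if new > 0 else 0.0
    return (new - old) / old * 100

# 1-in-a-million down to 0: a 100% decrease
print(pct_change(1, 0))   # -100.0
# 1 up to 2: a 100% increase
print(pct_change(1, 2))   # 100.0
# 1 up to 3 is a 200% jump; 2 up to 3 only 50%
print(pct_change(1, 3))   # 200.0
print(pct_change(2, 3))   # 50.0
# 492,368 up to 492,369: proportionally almost nothing
print(pct_change(492_368, 492_369))  # ~0.0002
```

The last line is the asker's own example: the same absolute step of one ticket that was an infinite proportional jump from 0 is a rounding error once you're near half a million.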
Your frame of reference matters a lot, as does what the odds are being applied to.