# the change (or lack thereof) in the probability of something inevitable occurring as time goes on


(I wasn’t able to find this specific question, so sorry if it’s been asked before)
Earlier today, I was told I would get a call between 5:00 and 6:00. Assuming the call is certain to occur within that time frame, I figured that every minute that passed without a call would increase the probability of it happening. For example, I was thinking that during the first minute the chance I get the call is 1/60, during the second minute it's 1/59 (as there are now only 59 minutes in which it could happen), and so on until the last minute, when it's 1/1 (if I still hadn't received the call).
However, I was thinking back to my Freshman statistics course where my professor was talking about how the chances of the event happening at any given moment are the same, regardless of what time it is. In this case, at any given minute, the chance of me getting a call would still be 1/60. Unfortunately, I don’t remember any of the terminology that goes with these concepts, so I can’t remember if that actually applies to this problem.
So, in short, does the probability of something presumably inevitable increase as time passes? Or is the chance the same at any given moment?

In: Mathematics

It’s the same scenario as if you had 60 marbles in a pouch and one of them was red. If you draw one at a time and don’t put them back in, you can see that your chances of drawing the red one, assuming you haven’t already, are one in however many are left.

Each marble has the same chance of being red, but you stop playing as soon as you get the red one, which prunes the decision tree as you go
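The marble game can be simulated. Here's a rough Python sketch (the function name, seed, and trial count are my own) that estimates the chance the red marble shows up on draw k, given it hasn't appeared yet:

```python
import random

# Marble-pouch simulation (hypothetical sketch): 60 marbles, one red,
# drawn without replacement until the red one appears.
def draw_until_red(n=60, trials=200_000, seed=0):
    rng = random.Random(seed)
    reached = [0] * (n + 1)  # trials that reach draw k with red still in the pouch
    hit = [0] * (n + 1)      # trials where draw k produces the red marble
    for _ in range(trials):
        red_position = rng.randrange(n) + 1  # red sits at a uniform position 1..n
        for k in range(1, red_position + 1):
            reached[k] += 1
        hit[red_position] += 1
    # Conditional chance of red on draw k, given it hasn't appeared yet
    return [hit[k] / reached[k] for k in range(1, n + 1)]

probs = draw_until_red()
print(probs[0])   # draw 1: close to 1/60
print(probs[29])  # draw 30: close to 1/31 (31 marbles left)
print(probs[59])  # draw 60: exactly 1.0, the red marble is all that's left
```

The estimates track 1/(marbles remaining), exactly the "one in however many are left" intuition.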

There are dependent and independent events.

Coin flips are independent: each flip does not depend on the previous one, so the chance of getting heads after 10 tails is still 50%.

Drawing from a bag of marbles is dependent: each marble you pull out reduces the number of marbles left in the bag. So if a bag holds 11 marbles, 10 red and 1 blue, there is a 1/11 chance of pulling the blue marble on the first pull, but a 100% chance of pulling it on the last pull if you previously pulled 10 reds.

The phone example is more like the bag of marbles. At the beginning of the hour, it's 1/60 for any given minute, but as the minutes wear on with no call, the probability increases for each remaining minute, because getting the call now depends on not having already gotten it.

In this case, since you have a window of time that the call *must* happen in, your initial thought is correct and at 5:59 if you haven’t gotten the call yet you know 100% it’s happening that minute.

If you only know that there is a 1/60 chance the professor will call you any given minute (no window where they must call you), then no matter how much time has passed it is still a 1/60 chance any given minute. Thinking it’s been so long there must be a greater chance now is the gambler’s fallacy (e.g. flipping a fair coin and getting heads 100 times in a row doesn’t mean that tails is due – there is still a 50/50 chance on the 101st flip).
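The no-window case can be sketched in Python (hypothetical; the 1/60-per-minute rate, seed, and trial count are assumptions for illustration). Even after 10 call-free minutes, the chance of a call in the next minute stays near 1/60:

```python
import random

# Memoryless version: a constant 1/60 chance of the call each minute,
# with no guarantee it ever comes.
rng = random.Random(1)
p = 1 / 60
trials = 300_000

waited_10 = 0   # runs that reach minute 11 with no call in minutes 1..10
called_11 = 0   # of those, runs where the call comes in minute 11

for _ in range(trials):
    if all(rng.random() >= p for _ in range(10)):  # no call in first 10 minutes
        waited_10 += 1
        if rng.random() < p:                       # still an independent 1/60 shot
            called_11 += 1

print(called_11 / waited_10)  # close to 1/60 ≈ 0.0167, despite the 10-minute wait
```

The wait carries no information, which is exactly why the gambler's fallacy is a fallacy in this game.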

TLDR: You and your professor are both right. You’re playing different games.

Here’s a card game version of your problem: Take a standard 52-card deck and add 8 jokers (to get 60 cards). When the clock ticks to the next minute, you turn over 1 card. When you turn over the Ace of Spades, the call arrives.

Which of these three outcomes are possible?

– (a) You never draw the Ace of Spades.
– (b) You draw the Ace of Spades once.
– (c) You draw the Ace of Spades more than once.

“Which outcomes are possible?” is a trick question: I didn’t describe the rules clearly enough yet. I could actually be describing two different games:

– Your card game: After you draw a card, you discard it. You leave all the discards in a face-up pile, and don’t clean them up until the hour has passed. In total, you shuffle the deck once, at the beginning.
– Professor’s card game: After you draw a card, you shuffle it back into the face-down draw pile. You shuffle the deck 60 times, once for each draw.

In your card game, (a) and (c) are impossible. You have to draw the Ace of Spades sometime. After the 60th draw, all the cards will be face-up, and the Ace of Spades must be among them. Once the Ace of Spades is face-up in the discard pile, it’s impossible for it to show up in future draws.

In the professor’s card game, you have a shuffled full deck every time. The Ace of Spades might or might not be the top card. (a), (b), and (c) are all possible in the professor’s game.

Probability increases in your game, but not in the professor’s game.

More formally, your card game is “drawing without replacement,” and the professor’s card game is “drawing with replacement.”
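Both games can be simulated. Here's a hedged Python sketch (function names, seed, and trial count are illustrative) that plays each game many times and counts how often the Ace of Spades turns up in 60 draws:

```python
import random

rng = random.Random(2)
N = 60  # 52 cards plus 8 jokers; card 0 stands for the Ace of Spades

def your_game():
    """Draw without replacement: shuffle once, turn over all 60 cards."""
    deck = list(range(N))
    rng.shuffle(deck)
    return deck.count(0)          # how many draws showed the Ace of Spades

def professors_game():
    """Draw with replacement: reshuffle before each of the 60 draws."""
    return sum(rng.randrange(N) == 0 for _ in range(60))

trials = 20_000
yours = [your_game() for _ in range(trials)]
profs = [professors_game() for _ in range(trials)]

print(set(yours))              # {1}: the Ace always appears exactly once
print(min(profs), max(profs))  # 0 and 2+ both occur: (a) and (c) are possible
```

In 20,000 rounds, your game never produces outcome (a) or (c), while the professor's game reliably produces both.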

I think the confusing part of this is the human aspect of “a call will come between x and y”. That introduces the “bullshit” factor: you have to believe there is a human whose life depends on calling you between 5 and 6, with no other option possible.

But the best way to translate this statement, factoring in both statistics and human bullshit while still speaking the truth, would be “any time between 5 and 6 pm has an equal chance of someone calling you”.

The person who needs to call you could suddenly get busy; they won’t call. They could die; they won’t call. Their phone could get stolen by Houthi rebels; they aren’t going to call. There are so many statistics-breaking “reality” factors here that this is less a statistics-class statement and more a sociology experiment.

From a statistics POV, the best translation is “any time between 5 and 6 pm is equally likely to get the call”. Per my hyperbolic points, that likelihood could be 0%.

You can’t really ascribe hard statistics logic to human behavior AND be a **good** customer service representative.

The terminology you would be looking for is “replacement”.

Drawing marbles until you hit the odd-colored one is with replacement if you put every drawn marble back before drawing the next (which keeps the probability of an odd-colored draw constant), and without replacement if you don’t (which gradually increases the probability of an odd-colored draw).

Your example is about drawing minutes without replacement, since the call is guaranteed to come within the hour and every elapsed minute will not come up as a possible “call window” again.
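The "minutes without replacement" picture gives an exact formula. A small Python sketch (the function name and the minute numbering are my own convention):

```python
from fractions import Fraction

# Chance the call comes during minute k, given it hasn't come in minutes
# 1..k-1, when the call minute is uniform over a 60-minute window
# (drawing minutes without replacement).
def chance_this_minute(k, n=60):
    return Fraction(1, n - k + 1)

print(chance_this_minute(1))   # 1/60 at 5:00
print(chance_this_minute(2))   # 1/59 at 5:01
print(chance_this_minute(60))  # 1 at 5:59: the call is now certain
```

This matches the 1/60, 1/59, …, 1/1 progression from the original question.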

It’s 5pm. The call will come in the next hour.

* What’s the chance it comes at 5:42? It’s 1 in 60.
* What’s the chance it comes at 5:03? It’s 1 in 60.
* What’s the chance it comes at 5:59? It’s 1 in 60.

It’s 5:15pm, you walk in on your friend, who you knew was waiting for the call. You can’t tell from their expression or body language whether the call has already come.

* What’s the chance it comes at 5:42? It’s 1 in 60.
* What’s the chance it came at 5:03? It’s 1 in 60.
* What’s the chance it comes at 5:59? It’s 1 in 60.

Now you ask them “did the call come?” Whatever they answer, you can now recalculate the probabilities – the *conditional probabilities*, conditioned on their answer.

Maybe they answer “yes, the call came.”

* What’s the chance it comes at 5:42? It’s 0. The call already came.
* What’s the chance it came at 5:03? It’s 1 in 15.
* What’s the chance it comes at 5:59? It’s 0. The call already came.

These are the *conditional probabilities* of the call coming in at different times, *given that* the call came before 5:15. You can calculate these with Bayes’ rule.

Or maybe they answered “no, the call hasn’t come yet.” Then you want the conditional probabilities of the call coming in at different times, *given that the call did not come* before 5:15. Again, you can calculate these with Bayes’ rule.

* What’s the chance it comes at 5:42? It’s 1 in 45.
* What’s the chance it came at 5:03? It’s 0. The call didn’t come yet.
* What’s the chance it comes at 5:59? It’s 1 in 45.

*Conditional probabilities* are new probabilities that take into account new information, and can be calculated from the *a priori probabilities* using Bayes’ rule.
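The updates above can be computed exactly. A Python sketch (assuming the uniform 1-in-60 prior; the numbering 1..60 for minutes 5:00..5:59 is my own convention), where conditioning just zeroes out the ruled-out minutes and renormalizes:

```python
from fractions import Fraction

minutes = range(1, 61)
prior = {m: Fraction(1, 60) for m in minutes}  # a priori: uniform over the hour

def conditional(prior, event):
    """Condition on `event` (a set of minutes): zero out everything else
    and renormalize. This is Bayes' rule with a uniform prior."""
    total = sum(prior[m] for m in event)
    return {m: (prior[m] / total if m in event else Fraction(0)) for m in prior}

came_early = conditional(prior, set(range(1, 16)))   # "yes, the call came" before 5:15
not_yet    = conditional(prior, set(range(16, 61)))  # no call yet at 5:15

print(came_early[4])   # 5:03 → 1/15
print(not_yet[43])     # 5:42 → 1/45
print(not_yet[4])      # 5:03 → 0 (the call didn't come yet)
```

Using `Fraction` keeps the arithmetic exact, so the 1/15 and 1/45 values from the answer fall out directly.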