There are two boxes. The first has a $100 bill and a $1 bill, and the second has two $100 bills. If I pick a random box and take out a $100 bill, what is the chance of me taking out another $100 bill?

I’m honestly stuck. I’ve seen people say 1/2, others 2/3. People mention the Monty Hall problem and Bayes’ theorem, but I’m still confused, so here I am.

Edit: I believe you are not allowed to change your box choice on the 2nd “turn”, as that would make having two boxes pointless, wouldn’t it?

24 Answers

Anonymous 0 Comments

Before I explain the solution to the problem, let’s see a different problem with dice, which are easier to understand.

You’ve got a regular six-faced fair die numbered from 1 to 6. What is the probability of getting a 3? When you want to calculate a probability of equally-likely outcomes, there’s a formula, which goes:

* Probability = **Number of desired outcomes** ÷ **Number of possible outcomes**.

When rolling a die, there are **six** possible outcomes, but getting a 3, the desired outcome, is only **one** of them. In this case, the probability would be **1/6**. Similarly, what is the probability of getting either a 4 or a 6? The number of possible outcomes is still **six**, but now there are **two** desired outcomes, so this probability is **2/6** (or 1/3).

What about this: “After rolling a die, the result is *at least 5*. What is the probability that the result was a 6?” We can still use the formula, but we have to be careful. What is the number of possible outcomes? If you say **six**, that’s incorrect, because **we know** that the result is *at least 5*, which means that the die was either a 5 or a 6. Those are the only possible outcomes if the result is at least five, so the number of possible outcomes in this case is **two**. What about the number of desired outcomes? Well, the desired outcome was getting a 6, so only **one** desired outcome. The probability is, then, **1/2**. And it makes perfect sense: if you roll a die and get at least a 5, you either got a 5 or a 6, each equally likely with a 1/2 probability.
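(Not part of the original answer: a quick Monte Carlo sketch of that dice claim, with variable names of my own choosing.)

```python
import random

random.seed(0)

# Roll a fair die many times, keep only the rolls that came up at least 5,
# and see what fraction of those were a 6.
rolls = [random.randint(1, 6) for _ in range(100_000)]
at_least_5 = [r for r in rolls if r >= 5]
p_six_given_at_least_5 = sum(r == 6 for r in at_least_5) / len(at_least_5)
print(round(p_six_given_at_least_5, 2))  # close to 1/2
```

The conditioning step is just the filtering: the “possible outcomes” shrink from all rolls to only the rolls that satisfy what we know.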

Let’s go back to the original problem, which I will restate to make it more clear:

> There are two boxes, box A and box B. In box A there is one $1 bill and one $100 bill. In box B there are two $100 bills.
> You now carry the following experiment: first, choose a box at random, and then pick a bill from said box. Suppose that the result of this experiment is that the bill that you got is a $100 bill. What is the probability that the chosen box was box B?

I changed the problem a bit to make it easier: you asked “what is the chance of taking out another $100 bill”, but that is the same as asking what is the probability that the box that you chose was box B.

What are the possible outcomes of the experiment (without taking into account that we know the result)? Let’s enumerate them:

1. You choose box A and pick the $1 bill.
2. You choose box A and pick the $100 bill.
3. You choose box B and pick one of the $100 bills.
4. You choose box B and pick the other $100 bill.

Each of these outcomes is equally likely (like rolling a die). If I asked: what is the probability of picking a $100 bill? Then the answer would be: there are **four** possible outcomes, of which **three** are desirable, so the probability is **3/4**.

The problem asks: “the picked bill is a $100 bill. What is the probability that the chosen box was box B?” What are the possible outcomes? Even if our experiment has **four** possible outcomes (rolling a die has six outcomes), if we know that we picked a $100 bill, only **three** of them are possible (when rolling a die, *at least five* means two possible outcomes). Out of those three, how many are desirable? One of them comes from choosing box A, and two of them from choosing box B. Thus, the number of desirable outcomes is **two**. The probability is **2/3**.
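(My own addition, not the answerer’s: the four outcomes above are few enough to enumerate directly, so the same counting argument can be checked exactly with exact fractions.)

```python
from fractions import Fraction

# The four equally likely outcomes, as (box, bill_drawn, bill_left).
outcomes = [
    ("A", 1, 100),    # box A, drew the $1
    ("A", 100, 1),    # box A, drew the $100
    ("B", 100, 100),  # box B, drew one $100
    ("B", 100, 100),  # box B, drew the other $100
]

# Unconditional probability of drawing a $100 bill: 3 of the 4 outcomes.
drew_100 = [o for o in outcomes if o[1] == 100]
p_drew_100 = Fraction(len(drew_100), len(outcomes))

# Given a $100 draw, probability the box was B (equivalently, that the
# remaining bill is also $100): 2 of those 3 outcomes.
p_box_b = Fraction(sum(o[0] == "B" for o in drew_100), len(drew_100))

print(p_drew_100, p_box_b)  # 3/4 2/3
```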

Anonymous 0 Comments

If the rules are such that you are guaranteed to get a $100 bill on the first try, then you have the Monty Hall situation; otherwise you have probability 0.5

Anonymous 0 Comments

When selecting a box you only have two choices. But by taking out one note at a time, each choice is split again into two options, resulting in four possibilities:

1. first $1, second $100,
2. first $100, second $1,
3. first $100, second $100,
4. first $100, second $100 (reverse of 3)

After selecting a box and before taking a note, all four options are equally likely. But after selecting a $100 note first, only three options (2, 3, 4) remain possible. In 2 of the remaining 3 options you have the box with $200, resulting in 2/3 (not 1/2).

If the reverse happened and you selected the $1 first, you would know with 100% certainty that you have the box with $101. No one would argue that you still have a 50/50 (or 1/2) chance just because that is the chance you started with.

Anonymous 0 Comments

The $ goes before the numbers. $100.

Anonymous 0 Comments

The probability is two thirds.

Let’s label the $100 note in the first box as A, and the two $100 notes in the second box as B and C.

When you pick the first note, you have four equally likely cases: $1, $100A, $100B, $100C.

If you got a $100 note, you can be in one of three possible cases:

* You picked A, and the other note in the box is $1
* You picked B, and the other note in the box is C
* You picked C, and the other note in the box is B

The counter-intuitive part of this whole thing is that all the notes look the same so you can’t tell B and C apart.

If you phrase the question differently, it’s a bit easier to get the right intuition: you have a box with a 1Direction album and an Apple, and a box with a Banana and a Cabbage. If you got an Apple, a Banana, or a Cabbage on the first draw, what’s the likelihood of getting either a Banana or a Cabbage on the second draw?

* You picked A(pple), and there’s a 1(Direction) in the box
* You picked B(anana), and the other note in the box is C(abbage)
* You picked C(abbage), and there’s a B(anana) in the box.

Another way to look at it is: you have one single bag with a $1 bill and three $100 bills. Two of the $100 notes have “get another $100” written on them, and the other has “get $1” written on it.

Anonymous 0 Comments

It would really help if the question was specified precisely. It’s highly ambiguous at the moment. This is how I interpret the question.

You pick one box at random. You take one note from that box at random. What is the probability that the other note in that same box is a $100 bill?

Another comment spells out the possibilities nicely:

>1: You pick the box with a $1 and $100 bill and draw the $1 bill
>
>2: You pick the box with a $1 and $100 bill and draw the $100 bill
>
>3: You pick the box with two $100 bills and draw the first $100 bill
>
>4: You pick the box with two $100 bills and draw the second $100 bill

We know that 1 is not possible since you picked a $100 bill, so there are only 3 paths you could be on. In one of them (2), the other bill will be $1. In two of them (3 and 4), the second bill will be $100. Therefore the probability is 2/3.

This is an example of conditional probability using Bayes’ theorem, but in this case it’s simple enough to explain without the formula.
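(A sketch of my own, not from this answer: for anyone who does want to see the Bayes’ theorem version spelled out, the calculation is short. Variable names are mine.)

```python
from fractions import Fraction

# P(box B | drew $100) = P(drew $100 | B) * P(B) / P(drew $100)
p_b = Fraction(1, 2)            # each box equally likely to be chosen
p_100_given_b = Fraction(1, 1)  # box B holds only $100 bills
p_100_given_a = Fraction(1, 2)  # box A: one of its two bills is $100

# Total probability of drawing $100 on the first pick.
p_100 = p_100_given_b * p_b + p_100_given_a * Fraction(1, 2)

p_b_given_100 = p_100_given_b * p_b / p_100
print(p_100, p_b_given_100)  # 3/4 2/3
```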

Anonymous 0 Comments

Chance for each individual bill: 25%

Chance of picking a $100 bill: 75%

Chance that the $100 bill came from the $200 box: 50%

Chance that the $100 bill came from the $101 box: 25%

50%/75% vs 25%/75%

-> 2/3 vs 1/3 chance for the other bill to also be a $100 bill

Anonymous 0 Comments

If you’re allowed to pick a box again, you have a 2/3 chance of getting another $100 bill. Like you said, in this scenario the boxes are pointless.

If you have to stay with the same box you already picked, it’s simpler: you have a 50-50 chance. The only important choice you made was which box to pick. The fact that you already drew a $100 bill out of that box is irrelevant. It doesn’t change the odds that you picked the box with two $100 bills in it.

Anonymous 0 Comments

Isn’t it 1/2, since it’s either $100 or $1?

Anonymous 0 Comments

Let’s say before you start they have drawn a red dot on the $100 bill going in with the $1, a blue dot on one of the other $100 bills, and a green dot on the last one. So the blue and the green are together in one box. (Let’s say the dots are written in invisible ink and are only visible after the whole experiment is over, but this gives us a way to talk about each bill separately.)

Before you pick anything, there are four possibilities:
A) $1 then $100 (red)
B) $100 (red) then $1
C) $100 (blue) then $100 (green)
D) $100 (green) then $100 (blue).

But you drew a $100 bill first. You don’t know what color the invisible dot is, so you might have the red, green, or blue $100 bill in your hand. But in any case, you’re clearly not in scenario (A).

Can we narrow it down at all between (B), (C), and (D)? No. There’s no way to know. You might have the red, the blue, or the green in your hand. They are all equally likely.

But in two of those scenarios, (C) and (D), you’re going to draw another $100 out, while in one of them your second bill is going to be a $1. So there’s a 2 in 3 chance you’ll get $100 and a 1 in 3 chance you’ll get $1.
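(Not from the answer above: the invisible-dot setup translates directly into a simulation, since tagging the bills keeps the two $100 bills in box B distinguishable. Names are my own.)

```python
import random

random.seed(1)

# Bills carry (value, tag), mirroring the red/blue/green dots.
boxes = [
    [(1, "plain"), (100, "red")],     # box A
    [(100, "blue"), (100, "green")],  # box B
]

second_is_100 = trials = 0
for _ in range(100_000):
    box = random.choice(boxes)
    first, second = random.sample(box, 2)  # draw both bills in random order
    if first[0] != 100:
        continue  # condition on having drawn a $100 bill first
    trials += 1
    second_is_100 += second[0] == 100

print(round(second_is_100 / trials, 2))  # close to 2/3
```

Scenario (A) from the list is exactly the runs the `continue` throws away; among the runs that survive, (C) and (D) outnumber (B) two to one.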