In a game of odds or evens, is it possible to have an advantage depending on the rules?

For those who are unaware (since I’m not sure how regional this is), a game of odds or evens works like this: a player calls “odd”, the other calls “even”, and then they simultaneously show each other their hands – usually just one hand per player – with a number of fingers held up (or no fingers, which counts as zero). The sum of those fingers will result in a number that is either odd or even, so one of the players wins.

I tried to research whether the game is always mathematically fair and I’ve found conflicting explanations. My question is which of these, if any, is correct:

**Explanation number one** is that it depends on the rules. If you play with one hand per player, there are six possible numbers (0 through 5), so between the two players you have 11 possible sums (0 through 10) but 36 possible combinations of fingers held up, many of which arrive at the same sum. 18 of those combinations give an odd sum and 18 give an even sum, so this version of the game would supposedly be fair.

However, some versions of the game consider a sum of 0 to be a draw, and in that case there are more valid odd combinations (18) than even combinations (17, since a 0 hand with a 0 hand no longer counts) in the probability pool. And if the players each choose to use *both* hands, each player can show 0 through 10 fingers, so there are 11 × 11 = 121 possible combinations (with sums ranging from 0 through 20), 61 of which are even, unless the zero is considered a draw, in which case it’s 60-60.
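(For reference, here’s a quick brute-force check of those counts. It’s just a small Python sketch that enumerates every possible pair of hands, so feel free to skip it.)

```python
from itertools import product

def tally(max_fingers, zero_is_draw):
    """Count even-sum, odd-sum and drawn outcomes when each player
    shows 0..max_fingers fingers."""
    even = odd = draws = 0
    for a, b in product(range(max_fingers + 1), repeat=2):
        total = a + b
        if zero_is_draw and total == 0:
            draws += 1
        elif total % 2 == 0:
            even += 1
        else:
            odd += 1
    return even, odd, draws

# One hand each (0-5 fingers): 18 even vs 18 odd
print(tally(5, zero_is_draw=False))   # (18, 18, 0)
# One hand each, zero sum is a draw: 17 even vs 18 odd
print(tally(5, zero_is_draw=True))    # (17, 18, 1)
# Two hands each (0-10 fingers): 61 even vs 60 odd
print(tally(10, zero_is_draw=False))  # (61, 60, 0)
# Two hands each, zero sum is a draw: 60 even vs 60 odd
print(tally(10, zero_is_draw=True))   # (60, 60, 1)
```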

**Explanation number two** is that the rules *don’t* matter and the game is fair no matter what. According to this explanation, the results are ultimately binary: the sum is either odd or even, so with two players you have four possible combinations: an even hand with an even hand (the result is an even sum), odd with odd (the result is even), odd with even (the result is odd) and even with odd (the result is odd). So two possible even combinations and two possible odd combinations, giving each player a 50% chance no matter the number of hands or whether a zero sum is considered a draw.

My question is: which explanation is correct, and if neither is, what is the explanation? I have the mathematical prowess of a concussed goldfish so I need some help with this one.

I’m so confused by this concept. Who decides what number of fingers to hold up? How do you control for bias or cheating? What if everyone calls odd? Where is this played??

This depends on what you assume about how the other player is choosing their numbers.

A. If you assume they play every number with equal probability, then the first explanation is correct.

B. The second explanation is correct if you assume the other player first chooses odd or even with equal probability, and only then picks a specific number to show. This is not the same as option A, since using two hands gives you 6 even numbers and only 5 odd ones (see the sketch below).
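To make the difference concrete, here is a rough sketch (my own, nothing official) that computes the exact chance of an even sum in the two-handed game under assumption A and under assumption B:

```python
from itertools import product
from fractions import Fraction

numbers = range(11)  # two hands: each player can show 0..10 fingers
evens = [n for n in numbers if n % 2 == 0]  # 6 even numbers
odds = [n for n in numbers if n % 2 == 1]   # 5 odd numbers

# Assumption A: each player picks every number 0..10 with equal probability.
p_even_A = Fraction(
    sum((a + b) % 2 == 0 for a, b in product(numbers, repeat=2)),
    len(numbers) ** 2,
)

# Assumption B: each player first picks a parity 50/50, then a number of
# that parity uniformly. Weight each pair of hands by its probability.
def weight(n):
    group = evens if n % 2 == 0 else odds
    return Fraction(1, 2) * Fraction(1, len(group))

p_even_B = sum(
    weight(a) * weight(b)
    for a, b in product(numbers, repeat=2)
    if (a + b) % 2 == 0
)

print(p_even_A)  # 61/121, a slight edge for whoever called "even"
print(p_even_B)  # 1/2, exactly fair
```

So under A the player who called “even” has a small edge (61/121, about 50.4%), while under B the game is exactly fair.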

If you do not want to assume anything, you could find the Nash equilibrium for the two-handed game, but I am too lazy to do it now. If you want to try: a Nash equilibrium in randomizing (mixed) strategies requires each player to be indifferent between all the numbers they actually play. Also, this assumes that the players are playing the Nash equilibrium, which would be plausible if they have played the game for a long time or have given it a lot of thought.

PS The Nash equilibrium is probably strategy B, leading to the second explanation.
But when people play such a game IRL, they will try to guess what their opponent will do, or try to randomize through various psychological means (like going through the numbers in sequence, or never showing more than 3 evens in a row), which will deviate from Nash.
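For what it’s worth, here is a small check of that indifference condition for the two-handed game: if the opponent plays strategy B (each parity half the time, spread evenly over the numbers of that parity), then every number you could show gives the same 50% chance of an even sum, so nobody gains by deviating. That is the sense in which strategy B looks like the equilibrium.

```python
from fractions import Fraction

# Opponent plays strategy B in the two-handed game: probability 1/2 on
# each parity, spread uniformly over the 6 even / 5 odd numbers.
evens = [0, 2, 4, 6, 8, 10]
odds = [1, 3, 5, 7, 9]
opponent = {n: Fraction(1, 2) / len(evens) for n in evens}
opponent.update({n: Fraction(1, 2) / len(odds) for n in odds})

# For every number I could show, the chance the sum comes out even:
for my_number in range(11):
    p_even_sum = sum(p for n, p in opponent.items()
                     if (my_number + n) % 2 == 0)
    print(my_number, p_even_sum)  # always 1/2, so I am indifferent

# Since no pure number does better than 1/2 against strategy B, and the game
# is symmetric, neither player can gain by deviating: a mixed-strategy
# Nash equilibrium.
```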

If a computer were playing this, then it would be number two. I am an ethicist and I wrote my dissertation on the intersection between logic and probability. I teach and lecture, but I also have a consulting firm where I work with the legal teams of companies to provide a comprehensive approach to vetting executive/management ideas prior to unpacking them within the greater company. All that to say, I have expertise in probability and translating it to real-world applications.

I shared that to share this: understanding probability in games like this does not matter. In my undergrad and through my Master’s I played poker to pay for school and my lifestyle needs. At one club there was a mandatory break every 90 minutes of cash game play when the number of dealers on hand dropped below a certain level. At this break we would play rock-paper-scissors while smoking, for $5 a hand. This one guy would crush every single time. It defied probability.

I left that city to do my Ph.D. but came back several years later, and he was still holding court, still winning hundreds a night at ro-sham-bo. I was talking with him about where I lived and my job and about the “good ole days”, and since I was now out of town, he told me he could literally play with someone four or five times and just read their hand posture coming down and know what they were going to shoot.

Unless the game is constructed around some crazy local rules, it should be a wash and random, but if one person seems really “lucky”, leave the game ASAP.

I don’t know the answer, but wanted to congratulate you on the way you set out the question. Understood the ask but yea, no idea.

I did, however, read something a while ago on how to win at rock-paper-scissors. Apparently we are hard-wired to stick with things that work and change things that don’t. So if your opponent just won with scissors, then statistically they are more likely to go with scissors next time than rock or paper. In the same way, if they lost with scissors, they are more likely to change.

Obviously, this won’t follow every time, but playing over a long enough timeframe will tilt the game in your favour.
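If anyone wants to play with that idea, here is a rough simulation; the exact probabilities are made up by me for illustration, not taken from anything I’ve read. The simulated opponent tends to repeat a move after winning with it and to switch after losing with it, and the exploiter bets on exactly that:

```python
import random

BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}
BEATEN_BY = {loser: winner for winner, loser in BEATS.items()}
MOVES = list(BEATS)

def opponent_move(last_move, last_result, stickiness=0.6):
    """Win-stay / lose-shift opponent with made-up probabilities."""
    if last_move is None or last_result == "tie":
        return random.choice(MOVES)
    if last_result == "win" and random.random() < stickiness:
        return last_move  # more likely to repeat a winning move
    if last_result == "lose" and random.random() < stickiness:
        # more likely to abandon a losing move
        return random.choice([m for m in MOVES if m != last_move])
    return random.choice(MOVES)

def exploiter_move(opp_last, opp_result):
    """Bet on the win-stay / lose-shift tendency."""
    if opp_last is None or opp_result == "tie":
        return random.choice(MOVES)
    if opp_result == "win":
        return BEATEN_BY[opp_last]  # expect a repeat, so beat it
    return BEATS[opp_last]          # expect a switch away from opp_last

wins = losses = ties = 0
opp_last, opp_result = None, None
for _ in range(100_000):
    me = exploiter_move(opp_last, opp_result)
    opp = opponent_move(opp_last, opp_result)
    if me == opp:
        ties += 1
        opp_result = "tie"
    elif BEATS[me] == opp:
        wins += 1
        opp_result = "lose"
    else:
        losses += 1
        opp_result = "win"
    opp_last = opp

print(wins, losses, ties)
```

Run it a few times; the exploiter’s wins come out well ahead of its losses, which is the long-run tilt described above.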

I’ve never played this way. I’ve always played with 1 or 2 fingers, and each player has just those two options. The possible sums are 2, 3, 3, or 4, so it’s a 50/50 shot.