Board Game Designers Forum - Comments for "The Monty Hall Problem"
https://www.bgdf.com/forum/game-creation/design-theory/monty-hall-problem
Comments for "The Monty Hall Problem"
55cards wrote:DogBoy
https://www.bgdf.com/forum/game-creation/design-theory/monty-hall-problem#comment-19619
<p><div class="quote-msg"><div class="quote-author"><em>55cards</em> wrote:</div><div class="quote-msg"><div class="quote-author"><em>DogBoy</em> wrote:</div><br />
I'm not sure if that's what the previous poster intended, but those numbers are correct.<br />
The 2 red cards have 25% chance, and the unchosen blue card has 50% chance.<br />
</div></p>
<p>It wasn't what I intended, but the numbers ARE indeed correct! (Took me a while to realise it though.) Just like in the original 3-card situation, the player is swapping his initial 1 card for (effectively) 2 cards, hence doubling his chances. In the 3-card situation this means going from 33% to 66%; in the 4-card/2-colour situation this means going from 25% to 50% (the extra card doesn't change the game, but it does reduce the absolute, though not the relative, percentages).</p>
<p>Think of it this way (this is how I rationalised it!): If the player picks a red card, the opponent must choose from the blue cards. Now, whether or not the opponent flips a card, the choice for the player is the same: keep your red card, or switch and get *both* the blue cards. That's the effective outcome for the player: swapping one card for two (the fact that the opponent may helpfully reveal a blank blue card doesn't change the equation). As I say, that is of course exactly the choice given to the player in the classic 3-card scenario. The 4th card changes the percentages, but not the rationale of the player's choice.</p>
<p>Thanks for proposing this twist on the problem — it's an interesting thought experiment! — and report back if you find a way to incorporate it into a game prototype. </div></p>
<p>Here's one way that I'm working on incorporating it: Player A wants a particular card. They select the card. Player B chooses 2 other cards that presumably Player A doesn't want.</p>
<p>Player B randomizes the cards and then places them face down. Player A chooses a card, and then Player B flips one of the other cards face up. Player B may then set a resource cost for Player A to change his or her selection (possibly within a range).</p>
<p>Depending on how much Player A needs the card he's looking for, he may or may not want to pay the resource cost to switch, especially since Player A knows that Player B knows where the desired card is.</p>
<p>Player B gets some interesting choices out of this interaction: they may not want to set the price so high that Player A won't pay it (if they want to get resources); however, they may ALSO want to bait Player A into making a specific choice (to either stay or switch), which will influence the price they set.</p>
<p>Potentially, Player A could switch to the flipped-over card. Presumably this would be free (or even gain him a resource payout), to compensate for the card being less useful to him (or even detrimental). </p>
<p>-OR- Player B might have to assign X resources from a total of Y resources. Player A can then keep his choice for free, pay X to switch to the other unflipped card, or GAIN Y-X resources by taking the flipped card. For example, Player B has to assign 10 resources across the two alternative cards: she assigns a cost of 3 for Player A to buy the unflipped card, and a value of 7 that Player A would gain by taking the flipped card.</p>
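<p>To see how Player B's pricing interacts with the odds, here's a rough expected-value sketch. All names and numbers are hypothetical, and it assumes the classic Monty setup where staying wins 1/3 of the time and switching wins 2/3, with the flipped card itself worth nothing to Player A:</p>

```python
def best_action(V, switch_cost, flip_payout, flipped_value=0.0):
    # V: what the desired card is worth to Player A, in resources.
    stay   = V / 3                        # keep the original pick (wins 1/3)
    switch = 2 * V / 3 - switch_cost      # pay X for the other unflipped card (wins 2/3)
    take   = flipped_value + flip_payout  # take the revealed card plus the Y-X payout
    return max((stay, "stay"), (switch, "switch"), (take, "take flipped"))

# With a 3/7 split of 10 resources and a desired card worth 12:
print(best_action(V=12, switch_cost=3, flip_payout=7))  # taking the payout wins
# The same split against a card worth 30 flips the decision:
print(best_action(V=30, switch_cost=3, flip_payout=7))  # switching wins
```

<p>The interesting tension for Player B is that the more Player A values the card, the more B can charge for the switch before the guaranteed payout becomes the better deal.</p>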
<p>Obviously this is all abstract, and would require a good game around it to work. But it at least creates some interesting interactions.</p>
Fri, 29 Jul 2011 20:47:33 +0000 | Yamahako | comment 19619 at https://www.bgdf.com
Now I get it...
https://www.bgdf.com/forum/game-creation/design-theory/monty-hall-problem#comment-19596
<p><div class="quote-msg"><div class="quote-author"><em>DogBoy</em> wrote:</div><br />
I'm not sure if that's what the previous poster intended, but those numbers are correct.<br />
The 2 red cards have 25% chance, and the unchosen blue card has 50% chance.<br />
</div></p>
<p>It wasn't what I intended, but the numbers ARE indeed correct! (Took me a while to realise it though.) Just like in the original 3-card situation, the player is swapping his initial 1 card for (effectively) 2 cards, hence doubling his chances. In the 3-card situation this means going from 33% to 66%; in the 4-card/2-colour situation this means going from 25% to 50% (the extra card doesn't change the game, but it does reduce the absolute, though not the relative, percentages).</p>
<p>Think of it this way (this is how I rationalised it!): If the player picks a red card, the opponent must choose from the blue cards. Now, whether or not the opponent flips a card, the choice for the player is the same: keep your red card, or switch and get *both* the blue cards. That's the effective outcome for the player: swapping one card for two (the fact that the opponent may helpfully reveal a blank blue card doesn't change the equation). As I say, that is of course exactly the choice given to the player in the classic 3-card scenario. The 4th card changes the percentages, but not the rationale of the player's choice.</p>
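<p>The "swap one card for two" argument can be checked by exhaustively enumerating the four equally likely prize positions. A small sketch (the card numbering is an illustrative assumption, and the player's red pick is fixed to card 0 by symmetry):</p>

```python
from fractions import Fraction

stay = other_red = blues = Fraction(0)
for prize in range(4):        # cards 0,1 are red; cards 2,3 are blue; prize uniform
    p = Fraction(1, 4)
    if prize == 0:            # the player's chosen red card
        stay += p
    elif prize == 1:          # the other red card
        other_red += p
    else:                     # either blue card: switching collects both
        blues += p

print(stay, other_red, blues)  # 1/4 1/4 1/2
```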
<p>Thanks for proposing this twist on the problem — it's an interesting thought experiment! — and report back if you find a way to incorporate it into a game prototype.</p>
Thu, 28 Jul 2011 22:54:24 +0000 | 55cards | comment 19596 at https://www.bgdf.com
Yamahako wrote: 55cards
https://www.bgdf.com/forum/game-creation/design-theory/monty-hall-problem#comment-19534
<p><div class="quote-msg"><div class="quote-author"><em>Yamahako</em> wrote:</div></p>
<p><div class="quote-msg"><div class="quote-author"><em>55cards</em> wrote:</div>2. I'm confident that the analysis of your last example, where the doors are coloured in pairs, is incorrect. When the opponent flips over a card of the opposite colour, no new information about the position of the 'good' card is introduced. Since there is only one good card and two cards of each colour, the opponent can always choose to reveal a undesirable card of the opposite colour to the player's initial choice, and so by doing so reveals nothing new. The situation in this example is exactly the same then, as your second example, and the player should always switch and may pick a card of either colour when he does so. Provided he switches his chance of success goes from 25% to 37.5%, and this is regardless of the colour of the card he chooses. In a sense you can split the player's objective into two independent goals: trying to get the guess where the location of the money (initially 1 out of 4), and trying to guess the colour of money card (1 out of 2: red or blue). When the opponent reveals an undesirable card, the player can switch to improve his chances of guessing the location, but the chances that his initial guess about the colour were correct remain exactly the same, so switching makes no difference since it's 50:50 either way.</div></p>
<p>Well that's what I'm wondering...</p>
<p>If you look at it this way: there's a 50% chance that the card is a red-backed card, and a 50% chance that it's a blue-backed card. I choose a red card. There's a 25% chance it's the card I chose. My opponent looks at the blue cards, and then flips over a "goat". That means the unflipped blue card has a 50% chance to be the "car", doesn't it?</p>
<p>Now, the unchosen red card is where (to me) it gets confusing. The card I picked doesn't have improved chances of being the "car". But the card I didn't pick previously now has a 1 in 3 chance of being the car...</p>
<p>I know it's counter-intuitive... especially since the percentages all add up. But the other red card just can't really be a 25% chance, can it? I'm also wondering if the new chances would change to be 50% for the blue card, 37.5% for the unchosen red card, and 12.5% for the chosen red card. But it also doesn't make sense that the originally chosen card has a lower than 25% chance.</p>
<p>If I'm understanding your response correctly, you feel the 2 red cards have 25% chance, and the unchosen blue card has a 50% chance? Is that correct?<br />
</div></p>
<p>I'm not sure if that's what the previous poster intended, but those numbers are correct.<br />
The 2 red cards have 25% chance, and the unchosen blue card has 50% chance.</p>
<p>50% of the time, the winning card will be red. In these cases, the blue card which is flipped can tell you nothing about whether you picked the winning red card or the losing red card. Both red cards will be at 50% and the unflipped blue card will be 0%.<br />
50% of the time, the winning card will be blue. In these cases, the unflipped blue card will be at 100% and the red cards 0% each.</p>
<p>On average,</p>
<p>P(Red chosen) = (50% x 50%) + (50% x 0%) = 25%<br />
P(Red unchosen) = (50% x 50%) + (50% x 0%) = 25%<br />
P(Blue unflipped) = (50% x 0%) + (50% x 100%) = 50%</p>
<p>In other words, you double your return by switching to the remaining card of the same colour as the revealed card.</p>
<p>If you doubt this, give it a try and find out!</p>
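<p>One way to "give it a try": a quick Monte Carlo sketch of the 4-card, 2-colour variant described above. The card numbering and trial count are just illustrative assumptions, and by symmetry the player's pick is fixed to the first red card:</p>

```python
import random

def trial():
    # Cards 0 and 1 are red, cards 2 and 3 are blue; the prize is uniform.
    prize = random.randrange(4)
    choice = 0  # the player picks the first red card (symmetry makes this general)
    # The opponent reveals a losing blue card (always possible: only one prize).
    blue_losers = [c for c in (2, 3) if c != prize]
    revealed = random.choice(blue_losers)
    unflipped_blue = 5 - revealed  # the other blue card, since 2 + 3 = 5
    return (prize == choice,          # staying on the chosen red card wins
            prize == 1,               # switching to the other red card wins
            prize == unflipped_blue)  # switching to the unflipped blue card wins

N = 100_000
wins = [0, 0, 0]
for _ in range(N):
    for i, won in enumerate(trial()):
        wins[i] += won

for label, w in zip(("chosen red", "other red", "unflipped blue"), wins):
    print(f"{label}: {w / N:.3f}")  # roughly 0.25, 0.25, 0.50
```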
Mon, 25 Jul 2011 08:26:15 +0000 | DogBoy | comment 19534 at https://www.bgdf.com
55cards wrote:I too think the
https://www.bgdf.com/forum/game-creation/design-theory/monty-hall-problem#comment-19462
<p><div class="quote-msg"><div class="quote-author"><em>55cards</em> wrote:</div>I too think the Monty Hall Problem is a fascinating little bit of maths, although because those maths are so fixed, I'm not sure how useful they are to the game designer (the relative wisdom of the player's choices will always be known and always be the same, hence the player will always behave in the same way).</p>
<p>Two things:</p>
<p>1. The numbers of your first example are slightly misleading, because you've rounded them up and down (as I'm sure you're aware). The expected return per game of switching is really $1.67, and the expected loss of not doing so is $16.67. Essentially the players either pay $50 to have a one-third chance of winning $100 (an obvious loss, and a choice no player would make) or pay $65 to have a two-thirds chance of winning $100 (a small but worthwhile return). But there's no game (I would say) since there's only one possible 'strategy' that doesn't result in an obvious and predictable loss for the player.</div></p>
<p>Yes, I was rounding. And I agree it's not a "game" outside of the way the term would be used for other games of chance (like Candy Land, or Chutes and Ladders). But it could be interesting in the context of another game, especially if the answer is less certain than just 1 card you want and 2 you don't want. Say it was 1 card you want, 1 you were indifferent about, and 1 that would be certainly negative to you. Though I think the 3-card scenario is far less interesting than the 4-card scenario for game design purposes, because while the player is better off switching, they have 2 choices to switch to, making it still a gamble.</p>
<p><div class="quote-msg"><div class="quote-author"><em>55cards</em> wrote:</div>2. I'm confident that the analysis of your last example, where the doors are coloured in pairs, is incorrect. When the opponent flips over a card of the opposite colour, no new information about the position of the 'good' card is introduced. Since there is only one good card and two cards of each colour, the opponent can always choose to reveal a undesirable card of the opposite colour to the player's initial choice, and so by doing so reveals nothing new. The situation in this example is exactly the same then, as your second example, and the player should always switch and may pick a card of either colour when he does so. Provided he switches his chance of success goes from 25% to 37.5%, and this is regardless of the colour of the card he chooses. In a sense you can split the player's objective into two independent goals: trying to get the guess where the location of the money (initially 1 out of 4), and trying to guess the colour of money card (1 out of 2: red or blue). When the opponent reveals an undesirable card, the player can switch to improve his chances of guessing the location, but the chances that his initial guess about the colour were correct remain exactly the same, so switching makes no difference since it's 50:50 either way.</div></p>
<p>Well that's what I'm wondering...</p>
<p>If you look at it this way: there's a 50% chance that the card is a red-backed card, and a 50% chance that it's a blue-backed card. I choose a red card. There's a 25% chance it's the card I chose. My opponent looks at the blue cards, and then flips over a "goat". That means the unflipped blue card has a 50% chance to be the "car", doesn't it?</p>
<p>Now, the unchosen red card is where (to me) it gets confusing. The card I picked doesn't have improved chances of being the "car". But the card I didn't pick previously now has a 1 in 3 chance of being the car...</p>
<p>I know it's counter-intuitive... especially since the percentages all add up. But the other red card just can't really be a 25% chance, can it? I'm also wondering if the new chances would change to be 50% for the blue card, 37.5% for the unchosen red card, and 12.5% for the chosen red card. But it also doesn't make sense that the originally chosen card has a lower than 25% chance.</p>
<p>If I'm understanding your response correctly, you feel the 2 red cards have 25% chance, and the unchosen blue card has a 50% chance? Is that correct?</p>
Thu, 21 Jul 2011 14:30:24 +0000 | Yamahako | comment 19462 at https://www.bgdf.com
About those coloured doors...
https://www.bgdf.com/forum/game-creation/design-theory/monty-hall-problem#comment-19461
<p>I too think the Monty Hall Problem is a fascinating little bit of maths, although because those maths are so fixed, I'm not sure how useful they are to the game designer (the relative wisdom of the player's choices will always be known and always be the same, hence the player will always behave in the same way).</p>
<p>Two things:</p>
<p>1. The numbers of your first example are slightly misleading, because you've rounded them up and down (as I'm sure you're aware). The expected return per game of switching is really $1.67, and the expected loss of not doing so is $16.67. Essentially the players either pay $50 to have a one-third chance of winning $100 (an obvious loss, and a choice no player would make) or pay $65 to have a two-thirds chance of winning $100 (a small but worthwhile return). But there's no game (I would say) since there's only one possible 'strategy' that doesn't result in an obvious and predictable loss for the player.</p>
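<p>A small sketch of the exact expected values, using the prices quoted here ($50 to keep the first pick, $65 to switch, a $100 prize):</p>

```python
from fractions import Fraction

prize = 100
stay_cost, switch_cost = 50, 65   # prices as described in point 1

ev_stay   = Fraction(1, 3) * prize - stay_cost    # -50/3, about -$16.67
ev_switch = Fraction(2, 3) * prize - switch_cost  #   5/3, about  +$1.67
print(ev_stay, ev_switch)
```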
<p>2. I'm confident that the analysis of your last example, where the doors are coloured in pairs, is incorrect. When the opponent flips over a card of the opposite colour, no new information about the position of the 'good' card is introduced. Since there is only one good card and two cards of each colour, the opponent can always choose to reveal an undesirable card of the opposite colour to the player's initial choice, and by doing so reveals nothing new. The situation in this example is exactly the same, then, as your second example, and the player should always switch and may pick a card of either colour when he does so. Provided he switches, his chance of success goes from 25% to 37.5%, and this is regardless of the colour of the card he chooses. In a sense you can split the player's objective into two independent goals: trying to guess the location of the money (initially 1 out of 4), and trying to guess the colour of the money card (1 out of 2: red or blue). When the opponent reveals an undesirable card, the player can switch to improve his chances of guessing the location, but the chances that his initial guess about the colour were correct remain exactly the same, so switching makes no difference since it's 50:50 either way.</p>
Thu, 21 Jul 2011 14:13:34 +0000 | 55cards | comment 19461 at https://www.bgdf.com