Math probability dilemma

Sgraffite

I just discovered these boards about a month ago but this is my first post! I've been toying with the design of a board game for about two months now but I'm having a problem finding some probabilities relating to cards. At first I thought this would be fairly simple to figure out by reading a few websites that covered collection probability, but it has me stumped. Here is my example:

40 card deck
6 cards with a diamond on them
5 card hand

What is the probability of having 2 diamonds in the 5 card hand? This is assuming no other players have drawn any cards from the deck.

Can someone explain in relatively simple terms how to find this? I've spent hours looking at websites that deal with probability, to no avail. They always seem to use an example where you have to have certain cards in every slot of your hand, as opposed to this example, where 3 slots can be any card. Any help would be appreciated!

-Sgraffite

Sebastian
Re: Math probability dilemma

Sgraffite wrote:
I just discovered these boards about a month ago but this is my first post! I've been toying with the design of a board game for about two months now but I'm having a problem finding some probabilities relating to cards. At first I thought this would be fairly simple to figure out by reading a few websites that covered collection probability, but it has me stumped. Here is my example:

40 card deck
6 cards with a diamond on them
5 card hand

What is the probability of having 2 diamonds in the 5 card hand? This is assuming no other players have drawn any cards from the deck.

Can someone explain in relatively simple terms how to find this? I've spent hours looking at websites that deal with probability to no avail. They always seem to use an example where you have to have certain cards in every slot of your hand, vice this example where 3 slots can be any card. Any help would be appreciated!

The chance of having exactly two diamonds in a 5 card hand is:

6/40 (chance of having the first diamond)
x 5/39 (chance of having the second diamond)
x 34/38 (third card isn't a diamond)
x 33/37 (nor is the fourth card)
x 32/36 (nor the fifth card)
x 5x4x3x2x1 (they can be in any order)
/ 2x1 (but the diamonds are identical, so we don't care what order they're in)
/ 3x2x1 (and the non-diamonds are identical, so we don't care what order they're in either)

= 129254400 / 947531520
= 0.1364

So about a 1 in 7 chance.
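The same 0.1364 also comes straight out of the combinations (hypergeometric) formula C(6,2) x C(34,3) / C(40,5). A minimal Python sketch of that check (assuming Python 3.8+ for math.comb):

```python
from math import comb  # Python 3.8+

deck, diamonds, hand = 40, 6, 5

# Exactly 2 diamonds: choose 2 of the 6 diamonds and 3 of the 34 other cards,
# out of all possible 5-card hands from the 40-card deck.
p_exactly_2 = comb(diamonds, 2) * comb(deck - diamonds, 3) / comb(deck, 5)
print(p_exactly_2)  # ~0.1364, i.e. about 1 in 7
```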

I hope that this helps.

zaiga
Math probability dilemma

I'm sure there's some magic formula that lets you calculate all this in one brief stroke, but I'm not a math guy, so I simply write out all the possibilities. These are all the possibilities of drawing exactly two diamond cards in hand (D = diamond, N = not a diamond).

DDNNN
DNDNN
DNNDN
DNNND

NDDNN
NDNDN
NDNND

NNDDN
NNDND

NNNDD

You draw the cards one by one from the deck. The chance of the first card being a diamond is 6 in 40. The chance of the first card being a non-diamond is 34 in 40. Drawing a card, either a diamond or a non-diamond, alters the chances for the next card, because there are now fewer cards of that type in the deck, and fewer cards overall. Therefore:

DDNNN = 6/40 * 5/39 * 34/38 * 33/37 * 32/36 = 1.36%
DNDNN = 6/40 * 34/39 * 5/38 * 33/37 * 32/36 = 1.36%
DNNDN = 6/40 * 34/39 * 33/38 * 5/37 * 32/36 = 1.36%
DNNND = 6/40 * 34/39 * 33/38 * 32/37 * 5/36 = 1.36%

NDDNN = 34/40 * 6/39 * 5/38 * 33/37 * 32/36 = 1.36%
NDNDN = 34/40 * 6/39 * 33/38 * 5/37 * 32/36 = 1.36%
NDNND = 34/40 * 6/39 * 33/38 * 32/37 * 5/36 = 1.36%

NNDDN = 34/40 * 33/39 * 6/38 * 5/37 * 32/36 = 1.36%
NNDND = 34/40 * 33/39 * 6/38 * 32/37 * 5/36 = 1.36%

NNNDD = 34/40 * 33/39 * 32/38 * 6/37 * 5/36 = 1.36%

Perhaps a bit surprisingly, the percentages don't change for each permutation (well, it surprised me!). Anyway, this adds up to a total chance of 13.6%, or about 1 in seven that a 5 card hand will contain exactly 2 diamond cards.

Now my question to you is: what are you going to do with this information? How is knowing the exact percentage of drawing 2 diamonds in a starting hand useful to you?

Johan
Math probability dilemma

zaiga wrote:
Perhaps a bit surprisingly, the percentages don't change for each permutation (well, it surprised me!). Anyway, this adds up to a total chance of 13.6%, or about 1 in seven that a 5 card hand will contain exactly 2 diamond cards.

This is not so surprising since it is the same calculation that is performed each time.

D = number of diamonds
N = number of non-diamonds
T = total number of cards

Chance (of any one specific ordering) = (D * (D-1) * N * (N-1) * (N-2)) / (T * (T-1) * (T-2) * (T-3) * (T-4))

The order in which you multiply does not matter, which is why every ordering comes out the same.
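To see this concretely, a minimal Python sketch (just an illustration of the point above, using the standard fractions and itertools modules) can compute the probability of every distinct ordering and confirm that they are all equal:

```python
from fractions import Fraction
from itertools import permutations

D, N = 6, 34          # diamonds and non-diamonds in the deck
T = D + N             # 40 cards total

def ordering_prob(pattern):
    """Probability of drawing the cards in exactly this order, e.g. 'DDNNN'."""
    d, n, t = D, N, T
    p = Fraction(1)
    for card in pattern:
        if card == 'D':
            p *= Fraction(d, t)
            d -= 1
        else:
            p *= Fraction(n, t)
            n -= 1
        t -= 1
    return p

orderings = sorted(set(permutations('DDNNN')))        # the 10 distinct orderings
probs = [ordering_prob(o) for o in orderings]
print(all(p == probs[0] for p in probs))              # True: every ordering is equal
print(float(sum(probs)))                              # ~0.1364 in total
```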

// Johan

zaiga
Math probability dilemma

Johan, you are right, of course, and now it makes sense to me. I just didn't realize it when I spelled out all the possibilities!

You know, the nice thing about mathematics is that it always makes sense, but it is not always intuitive.

mawibse
Math probability dilemma

zaiga wrote:
You know, the nice thing about mathematics is that it always makes sense, but it is not always intuitive.

Hmmm still not sure 0.999999...forever = 1 makes sense.. well it does but... eh!

Pt314
Math probability dilemma

mawibse wrote:

Hmmm still not sure 0.999999...forever = 1 makes sense.. well it does but... eh!

Infinity can be perplexing; that's why I had to be so careful in my Foundations of Analysis class, keeping track of what is legal and what isn't when working with infinite sequences.

Here's a simple proof for your example, though:
x = 0.999999...
10x = 9.999999... (multiply both sides by 10)
10x = 9 + x (since 9.999999... = 9 + 0.999999... = 9 + x)
9x = 9 (subtract x from both sides)
x = 1 (divide both sides by 9)
QED

Sebastian
Math probability dilemma

zaiga wrote:
I'm sure there's some magic formula that lets you calculate all this in one brief stroke, but I'm not a math guy, so I simply write out all the possibilities.

It's the one I used above - you've got 5x4x3x2x1 possible orderings, but you don't care what order the diamonds are in, so you divide by 2x1, and you don't care what order the non-diamonds are in, so you divide by 3x2x1.

That gives you (5x4x3x2x1) / (2x1 x 3x2x1) = 120 / 12 = 10.

sedjtroll
Math probability dilemma

Pt314 wrote:
Here's a simple proof for your example though.
x = 0.999999...
10x = 9.999999... multiply by 10
10x = 9 + x subtract x from both sides
9x = 9 divide by 9
x = 1 QED

Before I diverge off topic a little, I agree with the sentiment... how is this useful?

Also, did you WANT the probability of drawing EXACTLY 2 Diamonds, or did you want the probability of drawing AT LEAST 2 Diamonds? Drawing at least 2 is more likely than drawing exactly 2...
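For reference, the AT LEAST 2 figure just sums the exact-count probabilities from 2 up to 5 diamonds; a minimal sketch along the same lines as the earlier one (again assuming Python 3.8+ for math.comb):

```python
from math import comb  # Python 3.8+

deck, diamonds, hand = 40, 6, 5
other = deck - diamonds

def p_exactly(k):
    """Probability of exactly k diamonds in the 5-card hand."""
    return comb(diamonds, k) * comb(other, hand - k) / comb(deck, hand)

p_at_least_2 = sum(p_exactly(k) for k in range(2, min(diamonds, hand) + 1))
print(p_at_least_2)  # ~0.154, a bit higher than ~0.136 for exactly 2
```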

Now, speaking of math proofs:

Assume that A=B
Multiply both sides by A
A^2 = AB
Now subtract B^2 from each side
A^2 - B^2 = AB - B^2
Factor both sides
(A+B)(A-B) = (A-B)B
(A-B) cancels out
(A+B) = B
Substitute A for B, since they are equal
2A = A
Finally, divide by A...
2 = 1

- Seth

[EDIT: Correct mistake... I had said that (A^2 + B^2) = (A+B)(A-B)... oops!]

Sgraffite
Math probability dilemma

Thank you guys for the quick responses!

Quote:
Also, did you WANT the probability of drawing EXACTLY 2 Diamonds, or did you want the probability of drawing AT LEAST 2 Diamonds?

Either was fine really as it was just for me to learn from. With the steps that Sebastian posted I'm pretty sure I can find any probability for my game that I would need to. This will be a great help in balancing, thanks again guys :)

-Sgraffite

Oracle
Math probability dilemma

sedjtroll wrote:
Now, speaking of math proofs:

Assume that A=B
Multiply both sides by A
A^2 = AB
Now add B^2 to each side
A^2 + B^2 = AB + B^2
Factor both sides
(A+B)(A-B) = (A+B)B
(A+B) cancels out
(A-B) = B
Add B to both sides
A = 2B
Substitute A for B, since they are equal
A = 2A
Finally, divide by A...
1 = 2

- Seth

Uh... A^2 + B^2 is not (A+B)(A-B)

As for the 0.99999... = 1 question, this is an identity. There are no tricks in proving it; the values are equal. Pt314's proof is perfectly valid. Another way is to sum the infinite series 0.9 + 0.09 + 0.009 + ...
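Spelled out, that series is geometric with ratio 1/10, so the standard sum formula gives:

$$0.999\ldots = \sum_{k=1}^{\infty} \frac{9}{10^k} = \frac{9/10}{1 - 1/10} = 1$$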

sedjtroll
Math probability dilemma

Oracle wrote:
sedjtroll wrote:
Assume that A=B
Multiply both sides by A
A^2 = AB
Now SUBTRACT B^2 from each side
A^2 - B^2 = AB - B^2
Factor both sides
(A+B)(A-B) = (A-B)B
(A-B) cancels out
(A+B) = B
Substitute A for B, since they are equal
2A = A
Finally, divide by A...
2 = 1

Uh... A^2 + B^2 is not (A+B)(A-B)

Of course you are correct... I made a mistake. I knew something was wrong with what I wrote. The corrected version is above...

GeminiWeb
Math probability dilemma

sedjtroll wrote:
Assume that A=B
Multiply both sides by A
A^2 = AB
Now SUBTRACT B^2 from each side
A^2 - B^2 = AB - B^2
Factor both sides
(A+B)(A-B) = (A-B)B
(A-B) cancels out
(A+B) = B
Substitute A for B, since they are equal
2A = A
Finally, divide by A...
2 = 1

Nice try, but you can't say that (A-B) cancels out, as you would be dividing by zero (A = B, so A - B = 0).

:)

Xaqery
Math probability dilemma

OK, my turn (at the risk of making enemies of the female board game designers):

Proof that girls are evil:

First recognize that Girls require Time and Money
Girls = Time X Money

And as we know “Time is Money”:
Time = Money

So:
Girls = Money X Money = Money^2

And Money is the root of all evil
Money = SQRT(Evil)

Girls = (SQRT(Evil))^2

Girls = Evil

Girls are Evil.

- Dwight

mawibse
Math probability dilemma

Since this thread is already totally OT by now (my bad, please forgive me), I feel that the classic non-intuitive probability problem should be presented:

You are a contestant on a TV show. The host presents you with three doors; behind one door is a prize, and the other two hide nothing.
After you have "randomly" chosen one door (you don't know what's behind any of them), the host, to spice things up, opens one of the other doors that does NOT contain the prize, leaving your first pick unopened (he knows what's behind all the doors).
You now have two doors to choose from: the one you first picked and the other unopened door.

What should you do to increase your chances of winning?
1) Stay with the door you first chose
2) Switch to the door you have not chosen
3) It doesn't matter, it's fifty-fifty (two doors) anyway

zaiga
Math probability dilemma

Ah, the Monty Hall problem. It's a classic. I was indeed thinking of that problem when I said that mathematics (strange, I almost typed "magic" there, hmmm) always makes sense, when you think about it, but it's not always intuitive, and this is a good example of it.

The Monty Hall problem makes me think of another mathematical case involving TV shows and hosts. It is of special interest to game designers, because it has to do with risk taking / risk avoiding and how many people generally tend to avoid risk.

You are in a TV show and, after a lot of difficult questions and defeating a lot of contestants, you have won $20.000. The host now offers you the chance to win even more. Again, there are three doors, and if you pick the right one you win $100.000 instead of $20.000. However, if you pick the wrong one, you'll lose everything you have won and go home empty-handed. You don't have to take this gamble; you may also choose to simply walk away with the $20.000. Do you take this gamble or not?

Mathematically speaking, you should always take the gamble. It gives you a 1 out of 3 chance to win 5 times as much. However, in real life, you will see many contestants taking the $20.000. They will reason that they have come so far and don't want to risk losing a lot of money, even though they can win even more. For them $20.000 is good enough. This is a good example of risk avoiding behaviour, and you will see it happening in board games too.

mawibse
Math probability dilemma

"Do you take this gamble or not?"
Is not a valid mathematical question given your data, you must specify what you want to know so that you can mathematically use the information provided to make an odds calculation:

Better questions given your data would be:
Should you gamble to have the highest chance to go home with 100.000?
Or
Should you gamble to have the highest chance to go home with at least 20.000?
Or
What would have the highest odds of leaving you with most money after 1000 tries be; gamble or not gamble.
And so on…

zaiga
Math probability dilemma

mawibse wrote:
"Do you take this gamble or not?"
Is not a valid mathematical question given your data, you must specify what you want to know so that you can mathematically use the information provided to make an odds calculation:

Mawibse, the whole point is that it is not a mathematical question, it is a psychological question. Mathematically the choice is clear: you should take the gamble. You have a chance of 1 out of 3 to win 5 times as much. If there were a roulette table with such odds I would be standing at it all day.

The questions you are asking are the questions that you would have to ask yourself when you are in such a position. Do you want to have a 1 in 3 chance at $100.000? Or are you satisfied with $20.000? I mean, how often do you get the chance to win $20.000? You already defied the odds by getting so far, why risk losing it all?

Xaqery
Math probability dilemma

Ha! We (5 or 6 software engineers) once spent an entire evening at a party passionately arguing this question. I will let others have that fun and not reveal the logic.

- Dwight

Xaqery
Math probability dilemma

The Monty Hall problem that is. :P

- Dwight

lar
Math probability dilemma

zaiga wrote:
Mathematically speaking, you should always take the gamble. It gives you a 1 out of 3 chance to win 5 times as much. However, in real life, you will see many contestants taking the $20.000. They will reason that they have come so far and don't want to risk losing a lot of money, even though they can win even more. For them $20.000 is good enough. This is a good example of risk avoiding behaviour, and you will see it happening in board games too.

I disagree with your thoughts on this one. You shouldn't always take the gamble; the reason is that it only works out if you can repeat the risk 3 times or more. In that way, over time you will make more money. Granted, there is a chance that you will never win the 100K regardless of the number of chances you have, but if you play the odds enough, it's in your favor to go for it. If you don't have the chance to take the same risk 3 or more times, then the odds are against you.

In a board game I don't know if I like those odds - especially if it causes you to lose the game, because having 5 times as much in one game isn't the same as having 100K over the span of 3 games. You simply have it once and kick butt; otherwise, you are playing from a losing position in the other 2 games, and we all know it's 2 out of 3 wins that counts, not the sum of the 3 games... unless it's Yahtzee...

Challengers
Math probability dilemma

For those who care, here is a web page discussing the Monty Hall problem.

http://www.math.toronto.edu/mathnet/games/montymath.html

This was a topic I covered for my home-school kids, as a branch of a course on Logic and Critical Thinking. I couldn't find the link that discussed Marilyn vos Savant's article. Apparently, she was lambasted by engineers who "discussed it all night" (smile) and concluded that she was wrong to state that the contestant should always switch.

The importance of clearly stating the parameters of a puzzle is pointed out in one particular letter to Marilyn vos Savant:

http://www.wiskit.com/marilyn/gameshow.html

While you're thinking these deep thoughts ... If a tree falls in the forest ...

Mitch

mawibse
Math probability dilemma

zaiga wrote:
Mawibse, the whole point is that it is not a mathematical question, it is a psychological question. Mathematically the choice is clear: you should take the gamble. You have a chance of 1 out of 3 to win 5 times as much.

My point was that you CAN'T say what's "mathematically clear" if it is not a mathematical question.

It's like saying:
There are 3 strawberries in a jar, and for every strawberry you eat you get a 1.5% higher chance of getting a stomach ache. Do you eat the strawberries?
Mathematically the choice is clear: you should not eat the strawberries, because they increase the chance of getting a stomach ache!

... this is incorrect; you can't apply math to a psychological question. Just because you have some odds and numbers in a question doesn't make it possible to "mathematically" say what's "clear" or not. If the question were "Should you eat the berries to decrease the chance of a stomach ache?", then you'd have a valid mathematical question.

Clearly defined, the Monty Hall dilemma is not so much "deep" as it is a matter of understanding. Just like "0.9999... forever = 1", or the fact that the more precisely you measure a subatomic particle's speed, the less exactly it has a position in space, and vice versa.

Most "deep" problems arise from the same define principle, let's take the "If a tree falls" example; If you clearly define what each word means, in the "If a tree" example "sound" is most important, then it's not that difficult to answer.

But if you're into "deep", answer this:
What choices you make depends on who you are.
Who you are is defined by the past.
You have no control over the past, therefore no control over who you are, and thus no control over any choices you make.
Where does "free will" come in?

zaiga
Math probability dilemma

mawibse wrote:
My point was that you CAN'T say what's "mathematically clear" if it is not a mathematical question.

OK, what I was assuming was that you would normally want to maximize the chance to win as much money as possible. Is this question mathematical enough for you?

In that case 33,3% x $100.000 > 100% x $20.000

Again, my point was that that may not be the premise the contestant is basing his decision on. His premise might be "maximize the chance to win at least $20.000" in which case he would not take the gamble. Or his premise might be "maximize the chance to win $100.000" in which case he would take the gamble.

mawibse
Math probability dilemma

zaiga wrote:
What I was assuming was that you would normally want to maximize the chance to win as much money as possible. Is this question mathematical enough for you?

With a lottery ticket that costs $100 and has a 1 in a billion chance of winning $101, you would still be "maximizing the chance to win as much money as possible" by gambling.
But I assume that is normally not what you want to do.

zaiga wrote:
In that case 33,3% x $100.000 > 100% x $20.000

1% x $2.000.001 is also > 100% x $20.000.
You'd have no problem having a 99% chance of losing $20.000 to have a go at two million and one dollars, right?
You shouldn't, IF you are allowed to try a hundred times, that is, because statistically you would gain about 1 dollar for every 100 times you gamble.
Too bad that most TV shows only give you one chance.
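In expected-value terms (a quick check of that 1-dollar-per-100-plays figure, writing the amounts in plain dollars):

$$0.01 \times \$2{,}000{,}001 \;-\; \$20{,}000 \;=\; \$0.01 \text{ per play} \;\approx\; \$1 \text{ per } 100 \text{ plays}$$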

zaiga wrote:
Again, my point was that that may not be the premise the contestant is basing his decision on. His premise might be "maximize the chance to win at least $20.000" in which case he would not take the gamble. Or his premise might be "maximize the chance to win $100.000" in which case he would take the gamble.

Both are perfectly mathematically valid, the second premise being identical to "maximize the chance to win as much money as possible", since $100.000 is as much money as you could possibly win in your scenario.

zaiga
Math probability dilemma

mawibse wrote:
With a lottery ticket that costs $100 and has a 1 in a billion chance of winning $101, you would still be "maximizing the chance to win as much money as possible" by gambling.
But I assume that is normally not what you want to do.

No, the best bet in this case is to not buy a lottery ticket at all.

Quote:
1% x $2.000.001 is also > 100% x $20.000.
You'd have no problem having a 99% chance of losing $20.000 to have a go at two million and one dollars, right? You shouldn't, IF you are allowed to try a hundred times, that is, because statistically you would gain about 1 dollar for every 100 times you gamble. Too bad that most TV shows only give you one chance.

Whether someone would take this gamble would depend on how much risk that person is willing to take and how much that person values $20.000 and $2.000.001. Just because it is a little over a hundred times as much doesn't mean a person also values it a hundred times as much. Could be more, could be less. Mathematically, however, it's a good bet.

The fact that you only have one chance is mathematically irrelevant. You could try 200 times and still not win the two million and one dollars, in which case you would have lost four million dollars. Or you could win it ten times in a row.

Of course, gambling is not so much about the maths as it is about psychology. People have no qualms about spending $25 on a lottery ticket that gives them a 1 in 100 million chance to win $10 million, but they wouldn't go to a casino, put $25 on a single number at the roulette table, and let the winnings ride until they have accumulated to over $10 million, even though the odds of that happening are much better.
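To make "much better" concrete (a rough check, assuming a single number pays 35 to 1, so a win returns 36 times the stake, on a 37-pocket European wheel): letting the $25 ride, four straight wins are enough, and

$$25 \times 36^4 = \$41{,}990{,}400 > \$10{,}000{,}000, \qquad \left(\tfrac{1}{37}\right)^4 \approx \tfrac{1}{1{,}900{,}000} \gg \tfrac{1}{100{,}000{,}000}$$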

Better yet is to not gamble at all, because that gives you a 100% chance of keeping your money!

zaiga
Math probability dilemma

jwarrend wrote:
This is somewhat related, I think, to the gaming phenomenon of "playing for position", in which a player who is in 2nd place can make a move that will either result in him winning the game, or coming in last place. Faced with this choice, many players will not make the move, since they'd rather come in 2nd than risk coming in last. This is flawed playing; the point of playing a game is to try to win, and 2nd and last place are equally "negative" results, but for some reason 2nd place feels better. I can understand the mindset a bit, I guess, but it's definitely yet another problem designers have to contend with, and I'm not sure how you can take irrational play into consideration when trying to design a working game.

Jeff, I wouldn't call this "flawed playing". It's just a typical example of risk avoiding behaviour, which may or may not be justified. Also, the point of playing a game may not always be trying to win. Different people may have different objectives, and they may not always be 100% logical or rational. Some people just want to try and not come in last place, some people want to try that oddball strategy just to see what happens, etc. If your design is robust, it will be able to cope with all these kinds of "irrational" play.

However, I do think that a design does not have to be able to cope with intentional destructive behaviour. Sometimes players will just take certain actions to wreck the game to prove that it is broken. For example, a group of Puerto Rico players who never pick Builder, Mayor or Captain, just because that way they can say it is broken because the game never ends. That would just be silly.

And then there's the big gray area between "irrational" and "intentional destructive" which we could debate forever, but I think the point is clear ;)

Zomulgustar
Math probability dilemma

Because I hate to see such a nasty little puzzle drift out of the discussion, I feel the urge to point out that I think people may be conflating several 'puzzles' here. The 'Monty Hall Problem' appears at the end of the first page, and zaiga's post at the top of page 2 is just remarking that utility is not necessarily linear in expected winnings.

To recap, the MHP is:

There are three doors. You know that one of these contains a prize, but the other two contain nothing. You arbitrarily select a door, and Monty (whether you have chosen correctly or not!) opens a different door, which does not have a prize. He then gives you the option to switch to the other unopened door, or stay on the one you originally chose. How do you maximize your chance of picking the door with the prize, and why?

Psychology only comes in when the parameters of the problem aren't trustworthy... how do we know there's a prize back there at all, or can it be moved without anyone knowing? What if he's allowed to decide what 'gimmick' he'll use after seeing whether the player picked the door with the prize?

doho123
Math probability dilemma

Quote:
How do you maximize your chance of picking the door with the prize, and why?

If it hasn't been covered before, you should always switch. The easiest way to describe this is that your original pick is not really a pick, but the host is letting you divide the three doors into two groups, one with a single door (your pick) and one with two doors (the two doors you did not pick).

The single-door group gives you a 1 in 3 chance of winning. The two-door group has a 2 in 3 chance of winning. Since there is always at least one ZONK! door in the two-door group (and the host knows which door it is), whether the host shows it to the player or not is irrelevant to the math.

jwarrend
Math probability dilemma

Quote:

Jeff, I wouldn't call this "flawed playing". It's just a typical example of risk avoiding behaviour, which may or may not be jusitified. Also, the point of playing a game may not always be trying to win.

I disagree; the reason that a player engages in the playing of a game may not strictly be to win (he may want social interaction, etc), but the point of a strategy game's mechanics and scoring systems is to coronate a winner. With the exception of party games or cooperative games, the objective of the players playing the game is to attempt to win the game.

I agree that it's a consequence of risk avoidance; people look at how much they stand to lose by making a risky play, rather than how much they stand to gain. But, if your only chance to win the game is to take the risk, you have to take it!

Quote:

Different people may have different objectives, and they may not always be 100% logical or rational. Some people just want to try and not come in last place, some people want to try that odd ball strategy just to see what happens, etc. If your design is robust, it will be able to cope with all these kinds of "irrational" play.

Yes and no. Let's say a player is presented with 2 choices: move A with a success probability of 75% which will place him in 2nd place, or move B with a probability of 40% which would win him the game. Many players would take move A, but the problem with this is that it can lead to a kingmaking situation, as it can ruin the 2nd-place player's attempt to win the game. I recognize there are other problems bundled in here: kingmaking, leader identification, etc. My point is simply to question whether a design should be able to accommodate flawed play.

My answer is that given that many players do play this way, it probably should, but this can be somewhat unsatisfying when trying to design a robust strategy game. It's hard enough to design a game that is exceptional and balanced when played by serious players; how much more so when you throw in players who may not make wise choices?

I do agree that a good design should handle oddball strategies.

Quote:

However, I do think that a design does not have to be able to cope with intentional destructive behaviour.

This I completely agree with. I give relatively little concern to cheating when designing, except for "accidental cheating".

-Jeff

Pt314
Math probability dilemma

mawibse wrote:
Since this thread already is totally OT by now, my bad please forgive me, I feel that the classic non intuitive probability problem should be presented:

...

What should you do to increase your chances of winning?
1) Stay with the door you first chose
2) Switch to the door you have not chosen
3) It doesn't matter, it's fifty-fifty (two doors) anyway

I just love trying to explain that you have a 2/3 chance of winning if you switch. If they don't get it, I increase the number of doors.

Suppose there are 100 doors, and the host knows which one has the prize. You pick one, and the host opens 98 other doors and shows that they are empty. Do you stick with the door you first picked? Or pick the door left over from the other 99? Of course you would be better off switching. You can only lose if you picked the right door at first, which has a 1% chance of happening.
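If the argument still feels slippery, a small Monte Carlo sketch (a rough illustration, assuming the host always opens every losing door except one) reproduces both the 3-door and the 100-door figures:

```python
import random

def monty_hall(n_doors=3, switch=True, trials=100_000):
    """Estimated win rate when the host opens every losing door except one."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(n_doors)
        pick = random.randrange(n_doors)
        if switch:
            # The host opens all other non-prize doors, so exactly one other
            # door stays closed; switching to it wins unless the original
            # pick already held the prize.
            wins += (pick != prize)
        else:
            wins += (pick == prize)
    return wins / trials

print(monty_hall(3, switch=True))     # ~0.667
print(monty_hall(3, switch=False))    # ~0.333
print(monty_hall(100, switch=True))   # ~0.99
```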

Hmm, it seems that a lot of the time when a combinatorics/probability topic is started here, it morphs into game theory, then psychology, then philosophy. Well, at least when we go off topic in this forum it doesn't end up in politics and religion, and thus a flame war.
