Math probability dilemma

44 replies
zaiga
Math probability dilemma

Infernal wrote:
What this means is that whatever choice you make now (to switch or not) has the same probability. When you first chose the door you only had the 1% chance. Now if you switch you have the 2/3 chance but the choice of keeping the door also has the 2/3 chance (is this right?). I just imagine that it would be just like reselecting the door after the host shows what is behind certain unselected doors.

No, the other door has a 99% chance of being the correct door. The door you have initially chosen has a 1% chance of being the correct door. This makes it very obvious that you should always switch.

zaiga
Math probability dilemma

An example makes this very clear:

There are 100 doors. You choose door #100.

If one of the doors #1 to #99 is the correct door the following is true:
You switch -> you win
You do not switch -> you lose

If door #100 is the correct door the following is true:
You switch -> you lose
You do not switch -> you win

So, in 99% of the cases you should switch, and in 1% of the cases you should not. All else being equal, you should always switch to have the best shot at choosing the correct door.
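
A quick way to double-check that count is to enumerate every possible prize location. The sketch below is mine, not part of the original post, and it assumes the host always opens every losing door except one:

Code:
# Enumerate every prize location for the 100-door version,
# with the player always picking door #100 (editor's sketch, not zaiga's).
doors = range(1, 101)
player_pick = 100

stay_wins = sum(1 for prize in doors if prize == player_pick)
switch_wins = sum(1 for prize in doors if prize != player_pick)

# Switching wins whenever the first pick was wrong, because the host's
# reveals leave the prize as the only other unopened door.
print(f"stay wins {stay_wins}/100, switch wins {switch_wins}/100")
# -> stay wins 1/100, switch wins 99/100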

Infernal
Math probability dilemma

But the context has changed. There are now only 2 doors, and it must be one of them. In the context of all the doors then, yes, there is only a 1% chance. But in the context of the reduced set of doors it is simply one of the 2 remaining possibilities.

It seems to me a bit like flipping a coin. Just because I have flipped 5 heads in a row, it does not increase the chance of me flipping tails on the next flip. I still have just a 50% chance.

OutsideLime
Math probability dilemma

Quote:
There are 100 doors. You choose door #100.

If one of the doors #1 to #99 is the correct door the following is true:
You switch -> you win
You do not switch -> you lose

I don't understand. If I switch, might I not switch to one of the 98 remaining doors that does not contain a prize, and therefore lose?

~Josh

zaiga
Math probability dilemma

OutsideLime wrote:
Quote:
There are 100 doors. You choose door #100.

If one of the doors #1 to #99 is the correct door the following is true:
You switch -> you win
You do not switch -> you lose

I don't understand. If I switch, might I not switch to one of the 98 remaining doors that does not contain a prize, and therefore lose?

~Josh

I'm sorry I wasn't clear. The host opens all doors except the one you have chosen and the door that contains the prize, or, in case you have chosen the prize door, all doors except yours and one other random door. Then you get the option to switch doors or not.

zaiga
Math probability dilemma

Infernal wrote:
But the context has changed. There are now only 2 doors, and it must be one of them. In the context of all the doors then, yes, there is only a 1% chance. But in the context of the reduced set of doors it is simply one of the 2 remaining possibilities.

It seems to me a bit like flipping a coin. Just because I have flipped 5 heads in a row, it does not increase the chance of me flipping tails on the next flip. I still have just a 50% chance.

That's not correct. Each coin flip is independent of any previous flip, but that has nothing to do with the Monty Hall problem.

When you initially choose a door it has a 1% chance of being the correct door. This is true for all doors. When the host opens all doors except the one you have chosen and one other door, the chance that the door you have chosen is the correct one hasn't changed. However, the chance that the other door is the correct one has suddenly increased dramatically!

If you can't wrap your head around this, just try it out in real life with 3 cards. First try the "always switch" strategy twenty times or so, then try the "never switch" strategy, and see whether you notice a difference between the two. If you don't notice a significant difference, try it with 10 cards and repeat the experiment.
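
If you'd rather let a computer do the dealing, here is a rough simulation of that card experiment. This is my own sketch, not something posted in the thread, and it assumes the host reveals every losing card except one:

Code:
import random

def play(n_cards, switch):
    # One card hides the "prize"; the player picks blindly, the host then
    # reveals every losing card except one, and the player stays or switches.
    prize = random.randrange(n_cards)
    pick = random.randrange(n_cards)
    if switch:
        # Switching wins exactly when the first pick was wrong,
        # since the one unrevealed card must then be the prize.
        return pick != prize
    return pick == prize

trials = 100_000
for n_cards in (3, 10):
    for switch in (True, False):
        wins = sum(play(n_cards, switch) for _ in range(trials))
        print(f"{n_cards} cards, switch={switch}: {wins / trials:.3f}")
# Expect roughly 0.667 vs 0.333 with 3 cards, and 0.9 vs 0.1 with 10 cards.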

Pt314
Math probability dilemma

Zaiga's example (and mine) on the doors problem, I feel, puts it in perspective. You should switch because the probability that your first choice is the right one is less than the probability that another one is the right one.

The host, who knows which door is right, opens all of the remaining doors except one, which effectively groups all of the rest of the doors into that one door. Of course you should switch.

On the subject of free will/determinism, here's a thought exercise.
Suppose free will doesn't exist and that everything that happens is already determined by the past. Suppose this makes it possible to build a computer that scans your brain and predicts what you are going to do in a given situation with 100% accuracy.

You are then placed in a room with two boxes, and you are allowed to pick one of the boxes or both of the boxes. One of the boxes is guaranteed to have $100; the other box may have $1,000,000 or nothing.
The computer (which has already scanned your brain) puts $1,000,000 in the second box only if it predicts that you are going to pick only that box.

So what would you do? If the computer wasn't accurate 100% of the time, but 90% of the time or some other percentage, how would this affect your decision?

One way of looking at this problem is that the computer has already either put the money in the box or not, so you will always be better off picking both boxes: $100 > $0 if there is no money in the one box, and $1,000,100 > $1,000,000 if there is.

On the other hand, if the computer is 100% accurate in knowing what you are going to do, then you should pick just the one box, because if you pick both you will only get $100 ($1,000,000 > $100).
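
The two answers can be compared with a simple expected-value calculation. The sketch below is mine, not Pt314's, with the predictor's accuracy p treated as a free parameter and the dollar amounts taken from the post above:

Code:
# Expected payoffs for the two-box thought experiment (editor's sketch).
def expected_one_box(p):
    # With probability p the predictor foresaw one-boxing and filled the big box.
    return p * 1_000_000

def expected_two_box(p):
    # With probability p the predictor foresaw two-boxing and left the big box
    # empty; with probability 1 - p it guessed wrong and filled it anyway.
    return p * 100 + (1 - p) * 1_000_100

for p in (0.5, 0.9, 1.0):
    print(p, expected_one_box(p), expected_two_box(p))
# At p = 0.9: one-boxing expects $900,000 vs $100,100 for two-boxing.

Under this framing, one-boxing already has the higher expectation once the predictor is right more than about 50.005% of the time, which is why the dominance argument ("the boxes are already filled") and the expected-value argument pull in opposite directions.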

Oracle
Math probability dilemma

Infernal wrote:
But the context has changed. There are now only 2 doors, and it must be one of them. In the context of all the doors then, yes, there is only a 1% chance. But in the context of the reduced set of doors it is simply one of the 2 remaining possibilities.

It seems to me a bit like flipping a coin. Just because I have flipped 5 heads in a row, it does not increase the chance of me flipping tails on the next flip. I still have just a 50% chance.

Look at it this way. There are 100 doors, a prize behind one and nothing behind the other 99. You choose 1, and then are given the choice to keep that one or switch to all 99 others.

Effectively this is what your choice is when 98 of the doors are shown to have no prize and you may then switch to the 99th or keep your original choice.

Sgraffite
Math probability dilemma

OMG, what happened to my thread?!?

TrekNoid
Math probability dilemma

jwarrend wrote:
I disagree; the reason that a player engages in the playing of a game may not strictly be to win (he may want social interaction, etc), but the point of a strategy game's mechanics and scoring systems is to coronate a winner. With the exception of party games or cooperative games, the objective of the players playing the game is to attempt to win the game.

There's also the short-term vs long-term win situation...

When you're playing a single game, I would agree that the goal is to win... But if you're playing a qualifying game for a later game, playing for 2nd is often a legitimate attempt to win ultimately.

E.g., in a poker tournament, I may play for second during the qualifying rounds, if the top two players from each qualifying round automatically advance... especially if playing for 1st might increase my likelihood of not qualifying.

Also, when monetary prizes are on the line, I might play for second, rather than risk not winning any money at all.

TrekNoid

Triktrak
Math probability dilemma

My initial thought was: you have a 1 in 100 chance of picking the right door to begin with, fine. Now all the other doors are opened except the one you picked and one other door. Sticking with the one you picked originally is the same as "choosing" that door, so there should still be a 50/50 chance of getting it.

My second thought was: So, out of all those other doors Monty just left that one mystery door. Unless you got extremely lucky the first time, he is naturally not going to open the other 98 doors and expose the prize. The odds are 99 in 100 that he has just left the prize door.

Very deceptive at first, very interesting. Has it been used in a boardgame?

Triktrak
Math probability dilemma

This thread triggered a vague memory of a paradox that I heard of while watching PBS or something educational. I hope someone can point me to the name of this paradox; I'm sure it could be useful in game design. A certain number of camera crews had to film a certain number of scenes in a movie. There was an optimal number of crews, which was lower than you would think, but too few crews were inefficient. However, when too many camera crews were added, it actually became a slower process as well.

mawibse
Math probability dilemma

Triktrak wrote:
Very deceptive at first, very interesting. Has it been used in a boardgame?

Don't think so; there would be no point once everyone understood it.

Sometimes it's difficult to separate what we make up from what's real. I'll give you an example:

There were three board game designers making dice game prototypes, and they gave the publisher 10 dice each for him to play with (30 dice in total). It turned out the publisher only needed 25 dice, but since he couldn't distribute the 5 dice he wanted to give back evenly, the publisher decided to keep 2 for himself and only give back 3, so the board game designers could get one die each back.

Since the board game designers had first given the publisher 10 dice each but got one back, they had effectively given only 9 dice each (10 - 1 = 9), or a total of 27 dice if you will (9 * 3 = 27).

Making the total sum of dice the publisher had: 9 * 3 = 27, plus the 2 dice the publisher kept = 29 dice (27 + 2 = 29).

Hmmm, 27 dice given and 2 kept adds up to 29, not 30!!
Where da heck did the last die go? *looks around on floor*

Oracle
Math probability dilemma

mawibse wrote:

Since the board game designers had first given the publisher 10 dice each but got one back, they had effectively given only 9 dice each (10 - 1 = 9), or a total of 27 dice if you will (9 * 3 = 27).

Making the total sum of dice the publisher had: 9 * 3 = 27, plus the 2 dice the publisher kept = 29 dice (27 + 2 = 29).

Hmmm, 27 dice given and 2 kept adds up to 29, not 30!!
Where da heck did the last die go? *looks around on floor*

I hate this "riddle".

It makes no sense to add 27 dice "given" and 2 dice "kept". These belong on opposite sides of the equation, like:

27 dice given = 25 dice used by the publisher + 2 dice kept by him.
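
To spell the accounting out, here is a tiny check (my own sketch, not from the thread):

Code:
# Bookkeeping for the dice riddle (editor's sketch).
given_out = 3 * 10        # 30 dice handed to the publisher
returned = 3              # one die back to each designer
kept_by_publisher = 2
used_by_publisher = 25

net_given = given_out - returned                                      # 27
assert net_given == used_by_publisher + kept_by_publisher             # 27 = 25 + 2
assert returned + used_by_publisher + kept_by_publisher == given_out  # 3 + 25 + 2 = 30
# The riddle's "27 + 2 = 29" wrongly adds the 2 kept dice to a total (27)
# that already includes them.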
