
Let's Make a Deal

Article #55 • Written by Alan Bellows

There is a classic mathematical nuisance known as the Monty Hall problem which can be hard to wrap the mind around. It is named after the game show "Let's Make a Deal," where a contestant was allowed to choose one of three doors, knowing that a valuable prize waited behind one, and worthless prizes behind the others.

On the show, once the contestant made her choice, Monty Hall (the host) opened one of the other doors, revealing one of the worthless prizes. He would then open the contestant's chosen door to reveal whether she picked correctly. The Monty Hall problem asks: what if the contestant were allowed to change her door choice after she saw the worthless prize? Would it be to her advantage to switch doors? In other words, if the contestant guesses that the new car lies behind door #1, and Monty opens door #2 to reveal a goat, is the new car more likely to be behind door #1, or door #3?

At this point, the imperfect wad of meat called the "brain" fires up its neurons, and usually informs its owner that revealing the contents of one of the other doors simply changed the contestant's odds from one-in-three to fifty-fifty. But that isn't the case. It has been mathematically proven that if the contestant were allowed to switch her door to #3 after seeing the goat behind #2, she'd be twice as likely to win.

How can this be? It isn't intuitive, but it's true. Great mathematicians have puzzled over this, as well as great scientists at Los Alamos and professors at MIT.

The best way to look at it is to imagine that when the contestant selects a door, she divides the doors into two sets: A) The doors she DID select, and B) the doors she did NOT select. At this point, each individual door has a one-in-three chance of being the winning door, but the two sets have differing odds... Set A has a one-in-three chance of containing the new car, but Set B, having twice as many doors in it, is twice as likely to contain the winner.

When Monty opens one of the doors in Set B to show it isn't the winner, Set B still has a two-in-three chance of holding the winner. The only difference is that there is only one door with unknown contents, so the 2/3 odds go to the unopened door in Set B, while Set A still has its 1/3 odds. So revealing the contents of door #2 didn't make the contestant's odds any worse, but it did make the odds for door #3 improve.

In explaining the effect, it helps to increase the scale of the question. Imagine that there are 100 doors to choose from instead of three. The contestant chooses a door, and then the host opens 98 other doors to show that they don't contain prizes. Which is more likely to hold the prize... the door she selected initially, or the one door left unopened from the 99 she didn't choose? The answer is much more obvious: the door she chose still has a one-in-a-hundred chance of being the winning door, where the other closed door has a 99/100 chance.
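The 2/3 advantage is easy to confirm empirically. Below is a minimal simulation sketch (in Python; the function name `play` is my own, not from the article). Since the host always opens a goat door from the unchosen pair, switching wins exactly when the first pick was wrong:

```python
import random

def play(switch, trials=100_000):
    """Return the fraction of simulated 3-door games won."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)   # door hiding the car
        pick = random.randrange(3)    # contestant's first choice
        # The host now opens a goat door from the two unchosen ones,
        # so switching wins exactly when the first pick was wrong,
        # and staying wins exactly when it was right.
        wins += (pick != prize) if switch else (pick == prize)
    return wins / trials
```

In practice `play(True)` hovers near 0.667 and `play(False)` near 0.333, matching the 2/3 and 1/3 odds described above.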

The problem is that the human brain is hard-wired to seek out patterns, discarding much of the non-patterned data. This system usually works very well in keeping unimportant information from overwhelming the mind, but occasionally too much information ends up on the cutting-room floor. There is another problem called the Gambler's Folly which also illustrates the mind's lackluster performance in gauging probability: Imagine that you flip an ordinary coin 99 times, and amazingly, it comes up heads every time. What are the odds that it will come up heads again on the 100th flip? Most people would say that it's a very unlikely possibility, but it turns out that the odds are exactly 50% (overlooking the negligible chance that the coin will land on its edge).

The reason we think otherwise is because our pattern-oriented brains see the 100-flip scenario as extremely rare, which it is... it has a one in 1,267,650,600,228,229,401,496,703,205,376 chance of happening. But if you sit down and write out any random sequence of heads and tails, it has the exact same odds of appearing as does 100 heads. The typical human brain just doesn't assign random sequences the same significance as clear patterns--such as 100 heads on a coin flip--so the importance of the pattern is artificially inflated.
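The gigantic figure quoted above is simply 2 raised to the 100th power, since each of the 100 flips has two equally likely outcomes. A one-line check:

```python
# Each of the 100 flips has 2 outcomes, so there are 2**100 equally
# likely sequences; any single specific sequence, all-heads included,
# has probability 1 / 2**100.
print(2 ** 100)  # 1267650600228229401496703205376
```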

The human brain is a great pattern recognition engine, but sometimes that causes it to overlook the subtleties of numbers.

Article written by Alan Bellows, published on 22 November 2005. Alan is the founder/designer/head writer/managing editor of Damn Interesting.

Article design and artwork by Alan Bellows.
65 Comments
rk_cr
Posted 22 November 2005 at 05:14 pm

There is a great version of this in "The Curious Incident of the Dog in the Night-Time" by Mark Haddon, which uses this trick to demonstrate how people often approach problems from the wrong perspective.


Rasmussen Mark
Posted 23 November 2005 at 04:09 pm

I don't buy it. The point of the Gambler's Folly is that the odds of a truly random event are not altered by previous outcomes. As stated in the article no matter how many coin flips come up heads the chances are still 50-50 that the next one will be heads too. Similarly no matter how many losing doors Monty opens, when there are only two left the odds are 50-50. The explanation of the Monty Hall effect above is flawed in that Monty's actions are not random when he opens the "losing" door(s). Thus the contestant's odds "reset" (so to speak) every time Monty opens a door he KNOWS is a loser. It is incorrect to carry forward odds that applied from previous conditions when calculating future random events when the conditions have changed in a non-random fashion.


Alan Bellows
Posted 23 November 2005 at 04:14 pm

Well, my ability to explain the effect may be in question, but the effect itself is very real, and very proven. Do some web searches on "monty hall problem" and you'll see lots of data about the subject.

It is VERY counter-intuitive, I'll give you that. But it's real.


foos
Posted 23 November 2005 at 05:59 pm

As someone who has taken graduate courses in probability (and has taught it), I can back up Alan Bellows here. In fact, his explanation is one of the clearest I've seen. To Mark Rasmussen and other doubters, it is very important to think about "independent events" such as coin tosses. The notion actually helps a great deal to explain what's going on here. Think of this: the door that Monty chooses to open is not an independent event at all. In fact, it's very much determined by the door that you choose (if the door you choose doesn't have the prize, then Monty is only able to open the *only* remaining door that doesn't contain the prize).

Another way to look at it is to choose which 2/3 of the doors you want. Say you want whatever's behind doors 2 and 3. You can then make it your strategy to first select door 1 and then plan to move to 2&3. Since Monty will open either 2 or 3 (whichever doesn't contain a prize), you get to see both of the doors you choose. Hence, going into the problem, you gave yourself 2/3 odds.

Hope that helps!


foos
Posted 24 November 2005 at 11:38 am

I just reread what I wrote, and it actually says nothing beyond what Alan wrote. Very well written above!


MeasureMan
Posted 24 November 2005 at 08:10 pm

I tried the Monty Hall Problem Simulator link. I first played 10 times, switching doors each time, and won 9 of 10. I then continued to 20 times, still switching doors, and finished with 16 out of 20 as wins. I'd put my money on those odds any day.


jokull
Posted 24 November 2005 at 08:56 pm

My moment of clarity: The door you picked does NOT have a 1/3 chance of being revealed to you at first. Instead either of the other two is revealed. IF all three doors have an equal chance of being revealed to you after your first pick you would THEN have a 50/50 chance (given you didn't pick the correct door right away).

Am I correct in my understanding?


rk_cr
Posted 27 November 2005 at 04:40 am

I think of it that you have a 1/3 chance of initially picking a "bad switch choice" and a 2/3 chance of picking a "good switch choice."

I also just played the Simulator game for 31 games and managed to win 20 of them...


Arcangel
Posted 04 December 2005 at 01:46 am

I didn't even bother to finish reading this article the first time I saw it but decided to revisit it over a week and a half later, read it through and then tried the simulator out. Boy was I wrong. I tried 20 games where I didn't switch doors and only won 8 times. I played another 20 games where I always switched doors and won 13 times. Guess it's a good thing I don't gamble.


ricm123
Posted 26 December 2005 at 11:31 pm

The simulator is flawed. The player NEVER selects the correct door on first choice. This should happen about 1 out of 3 times, yet does not.


Tapetum
Posted 05 January 2006 at 07:02 pm

The clearest explanation I've seen is an expanded example. Say I lay out a deck of cards, face down. I ask you to pick one out as the Ace of Spades. Instead of turning it over, we then start turning over other cards, one by one. We don't hit the Ace of Spades. When there are two cards left, and the Ace of Spades still hasn't shown, are you going to want to stick with your original 1:52 pick?


imiJn
Posted 11 February 2006 at 11:35 am

Rasmussen Mark is right. It has everything to do with the host's knowledge which enables him to choose a losing door. In the 100 door scenario, in the unlikely event that Monty RANDOMLY opens 98 doors before leaving two, the odds are, as in the gambler's folly, 50/50, the past events don't affect the outcome, but when Monty KNOWS which door not to open, the whole thing starts out with the guesser having one in a hundred chance of getting the prize, and Monty having a ninety-nine in a hundred chance of having one of his doors with the prize. Since he KNOWS which 98 losing doors to open first, that final door does indeed have the same 99% chance of being the door.

Just remember, if the whole thing were random, one in a hundred times the contestant would pick the correct door, but usually the game would end with Monty revealing the prize behind one of the other ninety-nine doors before opening 98 losing doors. The Wikipedia entry has it right when it states as a condition that: "The game host knows what is behind each door."


sleepyrohan
Posted 11 February 2006 at 12:45 pm

When I first read the article, the question that bothered me was: if the prize-winning door is pre-decided, then no matter which door the contestant chooses, whether he switches or not, shouldn't the probability of winning always be 33%?
The simulation confused me even further so I made a simple C++ program in which first the prize-winning door was chosen, then the door containing the goat was chosen, and then the contestant's answer and his decision to switch or not were chosen by the computer, all randomly. It finally displayed the percentages of games won in case of switching and not switching after around 500 runs. I executed the program several times and in each case the result was:
In case of switching: 49.2-51.6% winning
In case of not switching: 32.3-34.1% winning
(these are approximately the minimum and maximum percentages)
It is clear that the percentages tend to 1/2 and 1/3 probability, and this i can now appreciate.
However according to alan and the people agreeing with him the probability is supposed to double (i.e =2/3 instead of 1/2) in case of switching. I cannot understand the logic behind this and my calculations also disagree with this. Someone please explain.


vernonintx
Posted 12 February 2006 at 12:56 am

Sleepy - Your logic is flawed. That is why you only got roughly 50% when switching. The key, as was stated before, is that your software has to not randomly pick the door but has to knowingly pick the door that doesn't have the prize behind it. So the program needs to be broken down into these routines.

First, have the "contestant" pick a door.
Second, have the "host" pick the remaining door that isn't the winning door.
Third, have the "contestant" then decide to switch or not.

The first and third routines can be random, but the second routine only has a choice if the first routine selected the door with the prize.
If the contestant picked the door with the prize, he can select either door.
If the contestant has picked a door with no prize, he has no choice on the doors. He has to select the one without the prize.

Go back and rewrite the software to reflect this change, and I'll bet that you come up with the same results.

I did the simple version by playing the game on the link 60 times, 30 switching and 30 not switching. When I switched, I won 23 times. When I didn't I only won 11. Pretty close to the predicted results.
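The routines described above can be sketched in code (a hypothetical Python version, not the linked applet). It also demonstrates the pitfall being discussed: if the host opens a random unchosen door and prize-revealing games are simply thrown out, the switching odds collapse to 1/2 instead of 2/3:

```python
import random

def switch_win_rate(host_knows, trials=200_000):
    """Fraction of games won by always switching.

    host_knows=True:  host deliberately opens a goat door (real game).
    host_knows=False: host opens a random unchosen door, and games in
                      which he accidentally reveals the prize are voided.
    """
    wins = played = 0
    for _ in range(trials):
        prize = random.randrange(3)              # hide the prize
        pick = random.randrange(3)               # contestant picks a door
        others = [d for d in range(3) if d != pick]
        if host_knows:                           # host opens a known goat door
            opened = random.choice([d for d in others if d != prize])
        else:
            opened = random.choice(others)
            if opened == prize:
                continue                         # voided game: prize was shown
        final = next(d for d in range(3) if d not in (pick, opened))
        played += 1
        wins += (final == prize)                 # contestant always switches
    return wins / played
```

With `host_knows=True` the rate settles near 2/3; with `host_knows=False` it settles near 1/2, which is what a simulation with a truly random host produces.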


angrycrying
Posted 12 February 2006 at 01:21 am

Think "Minesweeper":

The best way I've been able to visualize this is to expand on the 100 doors concept in the article - by applying it to the Minesweeper game in Windows.

Imagine that you have a 10 by 10 minefield with 100 squares.

You're told that there's only one mine under all the squares.

You mark one square as the one you guess it's under. Your chances of getting it right are 1 in 100.

Next, you click an imaginary button that automatically uncovers all squares but two - the one you marked, and the one the mine is under.

Pretend, for instance, that the square you picked is near the top left corner, and the only other uncovered square is near the bottom right corner.

If you were given a chance to guess again, almost everyone I know would pick the other uncovered square near the bottom right corner - and they'd have a very good chance of being right.


mrdan
Posted 12 February 2006 at 02:02 am

This is how I view the problem:

When you first make your choice, you have a totally random 1/3 chance of being right.
Monty knows which one is the money and thus shows you one of the donkeys.
This means there are now 2 options left, one money and one donkey, thus you have a 1/2 chance either way.

The question is whether to switch or not. Your original decision was based on a 1 in 3 chance, whereas if you do switch, that is based on a 1 in 2 (50:50) chance.


angrycrying
Posted 12 February 2006 at 02:05 am

You can also see the Minesweeper visualization here, with screenshots:

http://www.angrycrying.com


angrycrying
Posted 12 February 2006 at 02:07 am

Too bad you can't edit posts - here's the Minesweeper visualization link again - this time one that works:

http://www.angrycrying.com/2006/02/monty-hall-problem-visualize-it-as.asp


sleepyrohan
Posted 12 February 2006 at 07:53 am

vernonintx said: "Sleepy - Your logic is flawed. That is why you only got roughly 50% when switching. The key, as was stated before is that your software has to not randomly pick the door but it has to knowingly pick the door that doesn't have the prize behind it. So the program needs to be broken down into these routines.

First, have the "contestant" pick a door.

Second, have the "host" pick the remaining door that isn't the winning door.

Third, have the "contestant" then decide to switch or not.

The first and third routines can be random, but the second routine only has a choice if the first routine selected the door with the prize.

If the contestant picked the door with the prize, he can select either door.

If the contestant has picked a door with no prize, he has no choice on the doors. He has to select the one without the prize.

Go back and rewrite the software to reflect this change, and I'll bet that you come up with the same results.

I did the simple version by playing the game on the link 60 times, 30 switching and 30 not switching. When I switched, I won 23 times. When I didn't I only won 11. Pretty close to the predicted results."

No, I don't think so.

My program runs exactly as you said.
In the first step, the contestant is free to choose any door (i.e. prize-winning or empty). Then when an empty door is to be shown by Monty, my program chooses 1 of the remaining 2 doors randomly and if it's the prize-winning door, it chucks that value and chooses the other remaining door (because it's the only option then).
So you see, that's not the problem. Maybe the applet here is flawed.

Also angry's Minesweeper visualization, mrdan's view of the problem, Tapetum's card approach, imiJn's 100-door method, etc. all say that switching would result in better chances, however no one says how much better the chances will be. Can anyone please give a clear logic for the chances increasing to 2/3 instead of 1/2 and also tell me why I got those values with my program?


angrycrying
Posted 12 February 2006 at 04:54 pm

The Minesweeper analogy was simply to make the answer to the problem seem more intuitive.

Like you suggest, sleepyrohan, it doesn't explain the odds.

The best way I've seen the odds explained (sorry, I can't remember where - but I can't take credit) - is to show all the possible outcomes and calculate the odds at each step.

I've added an image to the bottom of my previous post that attempts to explain this:
http://www.angrycrying.com/2006/02/monty-hall-problem-visualize-it-as.asp


Alan Bellows
Posted 12 February 2006 at 11:43 pm

I have created a simple Javascript simulator which continually runs simulations based on randomly generated variables. The source Javascript is available for download. Check it out here.


Didactus
Posted 13 February 2006 at 08:27 am

Sleepyrohan,

If you would like to send me your C++ code, I'll tell you where the bug is.

You asked for a clear logical explanation of why the odds specifically go to 2/3. Here's the way I think about it.

Let's say you initially pick door 1. The prize is hidden behind door P. Let's analyze how often a switching strategy wins, based on the three possible, equally-likely cases for P.

P=1: You can switch to 2 or 3, depending on what Monty does, but both ways lose. Chances of winning with a switch: 0%.

P=2: Monty *must* pick door 3, since you've picked door 1, and the prize is behind door 2, as Monty knows. So a switch will pick 2, the only remaining door. Chances of winning with a switch: 100%.

P=3: Monty *must* pick door 2. So, again, chances of winning with a switch: 100%.

Given this, your chances of winning are 2/3, because in 2 of the 3 cases of where the prize is, you win 100% of the time.
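Didactus's three cases can also be written as an exact enumeration (a small sketch; the names are mine). Averaging over all equally likely (prize, pick) pairs gives precisely 2/3, with no randomness involved:

```python
from fractions import Fraction

def exact_switch_odds():
    """Exact probability that always-switching wins, by enumeration."""
    wins = cases = 0
    for prize in range(3):        # where the prize actually is
        for pick in range(3):     # the contestant's first choice
            cases += 1
            # The host's forced reveal means switching wins
            # if and only if the first pick was wrong.
            wins += (pick != prize)
    return Fraction(wins, cases)

print(exact_switch_odds())  # 2/3
```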


sleepyrohan
Posted 13 February 2006 at 08:57 am

Thank you all. Now I have finally understood the logic. And I also found the error in my program (everyone makes mistakes) and got the correct results, so sorry for troubling you all with that.


wart35
Posted 15 February 2006 at 08:58 pm

Ok, I finally understand. I'll try to explain...

So, if you're a viewer of the show and watch many episodes where the contestant can switch, then the results that you will see will be:

For the people that choose to switch will have a 2/3 chance of winning because you only have a 1/3 chance of choosing correctly in the beginning, leaving you with a 2/3 chance of winning with a switch.

But, if you are a contestant on the show, after the first door revealed to be nothing, you only have a 50/50 chance of choosing correctly because you are faced with only 2 choices. I think, if I'm wrong say something.


Alan Bellows
Posted 15 February 2006 at 09:57 pm

wart35 said: "But, if you are a contestant on the show, after the first door revealed to be nothing, you only have a 50/50 chance of choosing correctly because you are faced with only 2 choices. I think, if I'm wrong say something."

Both the person at home and the contestant have the same odds by the reasoning you described in your first paragraph. Regardless of which doors are opened to reveal what, when three doors are present, a person always has a 1/3 chance of having guessed right with their first door, and a 2/3 chance of having guessed wrong. Switching doors is simply discarding the 1/3 odds in favor of the 2/3 odds.


wart35
Posted 15 February 2006 at 10:09 pm

Yea, you're right... It's slightly over my head, but I'm beginning to grasp it.


noelgreen
Posted 24 February 2006 at 02:09 pm

This is easy to do with a deck of playing cards where you choose out 3 cards and one of them is the ace of spades. Have a friend help and try to choose the ace. Each time after you've chosen, have the friend show you one of the remaining cards that's not the ace... I say 'one' because it could be both are not it. Then keep your card... 33% of the time you'll get the ace.

I had made a little Flash animation / game for this problem a few years back, and it's still on my web site. Here's the link. http://www.noelgreen.com/picktheace.swf

My younger brother had a big problem with this, as do a lot of people. It's a good proof against the possibility we'll ever invent androids or AI. We can't get a computer to have faulty reasoning.

Great post! Thank you so much!!


rp2
Posted 07 March 2006 at 12:05 pm

20 with switch- 7 wins

20 with no switch - 8 wins

...


godsgrandson
Posted 16 March 2006 at 09:46 am

Alright, my turn. Imagine the 3 doors; A, B, C. A has the prize.

Now if you cannot change, you have a 1 in 3 chance. Only if you pick A do you win.
If you can change, and automatically do, the original choice of either B or C will win. Simple.
There are only 6 possibilities to the entire setup: Choose A, switch. Choose A, don't switch. Same for the other two. But switching gives odds of 2/3 against not switching, which gives odds of 1/3.

Who needs computer programs?


c_s_1987
Posted 25 March 2006 at 07:26 am

I was just going to make a point, but then I realised imiJn already made it. For anyone who is still confused, I recommend you read his post. It also explains why the prize is never selected on the first door opening.


kneller
Posted 02 April 2006 at 08:36 pm

I find it fascinating how difficult the human mind finds it to comprehend probability.

A few months back there was a crisps promotion in the UK which was giving away iPod minis. The way the promotion worked was: each packet had a code, which could be entered at any time. Every 15 minutes or so, a draw would be made and one entry from that period would win.

Along with several colleagues, I wondered whether the odds of winning were greater if you entered all your codes in one 15 minute period or separately. It seemed intuitive to most people that entering them all at once gave you greater odds - and indeed it gave you better odds of winning *in that period*...

Unfortunately, your odds overall were better if you entered 1 in each period - not only do you have a better chance of winning an iPod, but you also have a (smaller) chance of winning a whole bunch. It was only when considering a scenario with only one other entry per period that this became obvious:

- If you enter 2 in one: your odds are 2/3
- If you spread the codes: your odds are 1! ( 1/2 + 1/2 ) of winning *something*
- If you spread the codes: you also have a 1/4 chance of winning 2 iPods! (1/2 * 1/2)


Mineh
Posted 04 April 2006 at 08:04 am

[[[[[[[[ REVISED VERSION ]]]]]]]]

Hi! I just joined, since I read this article it was too damn interesting for me not to comment lol. Anyways, here is the easiest way that I can explain the Monty Hall problem. I hope it helps.
———————————————————————————————–
You decide to play a game with your friend to see if she can find a dollar bill.
There are 3 notebooks on a table: (A), (B), (C). Notebook (C) contains the dollar bill.

You leave the notebooks closed and tell your friend to pick a notebook she thinks contains the dollar bill. She picked notebook (A). Since there are 3 notebooks, every notebook has a 1/3 chance of containing the dollar. Therefore, (A) has a set chance of 1/3. This will not change.

So here you have (A), (B), and (C). You make it easier for her by taking out of the game 1 of the remaining notebooks (B) or (C) that is sure not to have the dollar. You know (C) has the dollar so you take (B) out of the game.

There is a 2/3 chance of (B) or (C) containing the dollar. Now she knows that out of (B) and (C), (B) does not contain the dollar. With (B) gone, the chances of (C) containing the dollar are set to 2/3. Therefore, it would be wise for her to change her choice to notebook (C), since the probability of (C) containing the dollar is higher.

People who do better with pictures might understand it better with a simple pie chart, which I just realised I could make. I made one at this link:

http://i48.photobucket.com/albums/f222/jetaqua/monteyhallchart.jpg

Or maybe people who are more mathematically inclined would understand it better as an equation:
A + B + C = 3/3
A = 1/3 (notebook separated from the equation because it was picked)
so
B + C = 2/3
then
B = 0 (confirmed to have no dollar, so it equals zero)
so C = 2/3
while A = 1/3
Meaning B had a zero chance of containing the dollar, C had a 2/3 chance of containing the dollar, while A had a 1/3 chance of containing the dollar.
———————————————————————————————–
That's it for my explanation, please tell me if you see any errors in my math or if I missed stuff, because it's 4:37 and I can't believe I spent two hours on this. But I'm pretty sure it's correct overall. Lol wow, gotta wake up for school in two and a half hours.. it's crazy! Oh well, it was worth it, I actually attempted to exercise my brain. Well bye everyone!


Bobt250
Posted 07 April 2006 at 06:42 am

The best explanation I've seen is here and I'm now a believer

http://www.comedia.com/hot/monty-answer.html


justdig
Posted 15 April 2006 at 03:17 pm

Here is a very VERY simple explanation:

If you pick the correct door initially, then Monty opens another, the one left over will be incorrect.
THEREFORE- if you initially pick the correct door, switching will give you the incorrect door.

Conversely, if you pick an incorrect door initially, then Monty opens another incorrect door, the one left over will be correct.
THEREFORE- if you intially pick an incorrect door, switching will give you a correct door.

Now, since the chance of picking a correct door initially is 1/3, and the chance of picking an incorrect door initially is 2/3, this means that switching will give you a 2/3 chance of getting the right door.


sulkykid
Posted 19 May 2006 at 01:34 pm

kneller said: "I find it fascinating how difficult the human mind finds it to comprehend probability.


A few months back there was a crisps promotion in the UK which was giving away iPod minis. The way the promotion worked was; each packet had a code, which could be entered at any time. Every 15 minutes or so, a draw would be made and one entry from that period would win.

Along with several colleagues, I wondered whether the odds of winning were greater if you entered all your codes in one 15 minute period or separately. It seemed intuitive to most people that entering them all at once gave you greater odds - and indeed it gave you better odds of winning *in that period*…

Unfortunately, your odds overall were better if you entered 1 in each period - not only do you have a better chance of winning an iPod, but you also have a (smaller) chance of winning a whole bunch. It was only when considering a scenario with only one other entry per period that this became obvious:

- If you enter 2 in one: your odds are 2/3
- If you spread the codes: your odds are 1! ( 1/2 + 1/2 ) of winning *something*
- If you spread the codes: you also have a 1/4 chance of winning 2 iPods! (1/2 * 1/2)"

OK, kneller, you have a mistake here. Without getting into it too heavily: when you spread the codes, you obviously do not have a probability of 1. Can you not envision a losing sequence? The odds are computed differently for the 2 methods. For the "spread" method, it is the coin flipping scenario: the odds are 50% for one "hit" and 25% for a double "hit". Hence 75% to win, 25% to lose.

As for Alan's "Gambler's folly", well, I will gladly take the 1,267,650,600,228,229,401,496,703,205,376 to one odds that the next coin flip is also heads. Do I need any explanation at all?
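sulkykid's 75%/25% figures for kneller's two-period example can be checked by enumerating the four equally likely outcomes (a sketch assuming, as in the example, exactly one rival entry per period, so each draw is a fair 50/50):

```python
from fractions import Fraction
from itertools import product

# Spread strategy: one code in each of two draws, each won with
# probability 1/2 independently. Enumerate (win draw 1, win draw 2).
outcomes = list(product([True, False], repeat=2))
p_any = Fraction(sum(1 for a, b in outcomes if a or b), len(outcomes))
p_both = Fraction(sum(1 for a, b in outcomes if a and b), len(outcomes))
print(p_any, p_both)  # 3/4 1/4
```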


Woodman
Posted 30 June 2006 at 06:58 am

sulkykid said:

As for Alan's "Gambler's folly", well, I will gladly take the 1,267,650,600,228,229,401,496,703,205,376 to one odds that the next coin flip is also heads. Do I need any explanation at all?"

Actually, that's not correct. The chance of getting 100 heads in a row and getting 99 heads in a row and then one tails is exactly the same (the number you stated); you would also have the exact same chance to get, for example, 37 heads, one tail and then 62 heads again. Any clearly defined combination has the same chance. This is also hard to grasp, but as opposed to the Monty Hall problem, in this case each flip is not in any way affected by the last flip; the last flip(s) cannot in any way alter the outcome of the next one.

The key that made me understand the Monty Hall problem was realizing that I'm not the only one making choices; the host is also making a choice to open another false door. (Although it has to be defined that way for the problem to work as it does.) That statement probably just messed up your thinking, but it worked for me.


justdig
Posted 05 July 2006 at 02:57 pm

By the way, the explanation of the Gambler's Folly is just plain wrong. While everything you've posted is true, the Gambler's Folly isn't because any given sequence is as likely as any other; those are just two unrelated, though both somewhat counter-intuitive, ideas in probability. The Gambler's Folly is just a demonstration that the probabilities of what is going to happen are independent of what has come before.


Threepwood
Posted 07 August 2006 at 10:51 am

I'd like to say that I found the article really damn interesting. I know very little about math, but I've read two other articles by Mr. Bellows on the subject (Benford's Law and The Birthday Paradox) and thoroughly enjoyed both. The qualities of mathematics that irritate the author (mostly for comic effect, I assume!) are those which I think make it interesting: that the truths revealed to us by mathematics can be preposterously out of sync with common sense, yet at the same time be 100% true and provable.


Implode
Posted 07 December 2006 at 11:15 am

Going with Mineh's mathematical approach to it, I'd like to start out by saying you missed some stuff, just as I am undoubtedly about to as well. I'd like to show a computation. So, first of all you have three doors: A, B and C. Behind one of them is a grand prize. The chance of it hiding behind each of the doors is equal for all three of them. Thus: A=B=C. You can choose any of them now. Now, I'm going to drop out a random letter, in this case C. You then get the formula A=B, thus showing that the odds of getting the prize are equally big whether you switch or not. This, however, is not the Monty Hall problem. In that, Monty picks only the goat/donkey/whatever you haven't picked (if you have picked one), and doesn't choose between them randomly. If Monty didn't know what door you picked, and could only choose one of the two "goat doors", I believe that your chances of winning by switching would always remain 1/2, whether he picked your door or not. I think this is what most rational people mistake for the Monty Hall problem, but in that, he would never touch your door, "thus" making the chances of winning if you switched 2/3. I hope this helps.


Implode
Posted 07 December 2006 at 11:18 am

Looking back, I realize that dropping a random letter out of A=B=C isn't the same as Monty choosing one of the non-prize doors, but it nevertheless shows another way of doing the problem that would be wrong. It still isn't Monty Hall.


CyberClaw
Posted 18 December 2006 at 10:43 am

Mathematically speaking, if you stick to the initial pick you have a 1/3 chance of being correct. No one sees any flaws here, right? Then that must mean that if you change doors you have 3/3 - 1/3 chances of being correct (2/3), since the host eliminated 1 and there is only 1 door remaining.

For people who know a little about programming, here is how it works:

-If you stick to your initial pick-
you pick between 3 doors
host eliminates 1 door of the 2 remaining
you stick to your initial pick
if the door you picked == random prize door, you win (1/3rd chance)
else you lose

-If you choose the remaining door after the host eliminates one-
you pick between 3 doors
host eliminates one of the doors remaining
you change to the door not eliminated
if the door you had initially picked == random prize door, you lose (in the above case we saw this had a 1/3rd chance of happening)
else you win (which means you have a 2/3rd chance of winning, since the initial "if" has a 1/3 chance of being true)
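[Editor's note: the pseudocode above translates directly into a runnable Monte Carlo simulation. A sketch in Python, not part of the original comment; the function name and trial count are my own choices:]

```python
import random

def play(switch, trials=100_000):
    """Simulate Monty Hall games; return the fraction of wins."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a door that is neither the pick nor the prize.
        opened = next(d for d in range(3) if d != pick and d != prize)
        if switch:
            # Switch to the one door that is neither picked nor opened.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == prize)
    return wins / trials

print(f"stick:  {play(switch=False):.3f}")   # ~0.333
print(f"switch: {play(switch=True):.3f}")    # ~0.667
```

Running it confirms the comment's two branches: sticking wins about a third of the time, switching about two thirds.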


Drakvil
Posted 03 January 2007 at 01:48 am

So is there an update for this involving the "Deal or No Deal" problem?


misanthrope
Posted 03 January 2007 at 08:18 am

You can apply maths to Deal or No Deal to see if the banker's offer is good or bad. You can just average the amounts left and compare that to the offer. I'll simplify by leaving it until the last round when you have two boxes and an offer. One box is £10, the other is £10,000. You have a 50/50 chance of getting each amount, so multiply the figures by .5 and add them, giving you £5,005. That's the same as adding them all together and dividing by the number of boxes.

Expand the choice to 5 boxes, say £1, £10, £100, £1,000 and £10,000 and you have an average of £2,222.20. ((1+10+100+1000+10000)/5).
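[Editor's note: the averaging described above is an expected-value calculation. A minimal Python sketch using the hypothetical figures from the comment:]

```python
def expected_value(amounts):
    """Each remaining box is equally likely, so the fair offer
    is the plain average of the amounts still in play."""
    return sum(amounts) / len(amounts)

print(expected_value([10, 10_000]))                 # 5005.0
print(expected_value([1, 10, 100, 1_000, 10_000]))  # 2222.2
```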

Of course, there's more to that game than maths. The banker works to the 'bird in the hand' principle, in that he offers under the average (about 2/3 of the average, judging by the few times I've seen it). Going back to the first example, he'll usually offer about £3,000 for a choice of £10 and £10,000. The rational choice is not to deal, but faced with the choice between a guaranteed £3,000 or the possibility of leaving with less money than it cost you to get there, rationality is hard to stick to.

Mind you, there would be little point (from a financial point of view) in having the banker there if he were to just offer the average each round. In the long term, it wouldn't save them any money. By offering under the average, they're saving themselves a few quid in the long run. In TV-land though, the reason is to make the choice harder to make, to make the gamble seem more attractive and to build tension. Tension = viewers.

---

On another note: anyone else think that the comments in this thread once again demonstrate the old principle that the less you know, the more convinced you are that you know everything?


mwillmek
Posted 27 February 2007 at 12:13 am

The mathematical solution described for this problem in the article is deceiving. The calculations may be correct but they don't correctly fit the problem.

Initially the odds of winning are 1/3. The extra chance of winning comes not from the contestant actually changing their guess but from being given the opportunity to change. Essentially they are given a new guess with different odds.

Under the "rules of the game" after the first guess a non-winning prize is *always* revealed and the contestant is then *always* given a new guess with only two doors left. This means the first guess amounts to showmanship and the actual odds of winning the prize were always 50/50.


sulkykid
Posted 27 February 2007 at 03:14 pm

mwillmek said: "The mathematical solution described for this problem in the article is deceiving. The calculations may be correct but they don't correctly fit the problem.


Initially the odds of winning are 1/3. The extra chance of winning comes not from the contestant actually changing their guess but from being given the opportunity to change. Essentially they are given a new guess with different odds.

Under the "rules of the game" after the first guess a non-winning prize is *always* revealed and the contestant is then *always* given a new guess with only two doors left. This means the first guess amounts to showmanship and the actual odds of winning the prize were always 50/50."

NO! NO! NO! Arrrrrgggghhhhh!


misanthrope7
Posted 27 February 2007 at 06:54 pm

Mwillmek: Please, re-check your logic before Sulkykid's head explodes. He is right after all, as is the article.


sulkykid
Posted 28 February 2007 at 09:33 am

OK, it is like this: when you choose, you have a 1/3 chance for the big prize, and the un-chosen prizes have 2/3. Someone, WHO KNOWS THE HIDDEN PRIZES, tells you that you can have the better prize of the 2/3 chance. Which is the better deal?


mwillmek
Posted 28 February 2007 at 10:46 pm

misanthrope7 said: "Mwillmek: Please, re-check your logic before Sulkykid's head explodes. He is right after all, as is the article."

The moment passed. No exploding heads I hope.


Zaphodile
Posted 06 October 2007 at 04:57 am

When I read this I was screaming "NO NO NO NO NO!" internally, because of course it doesn't make sense. Because it's not right! You never state in the article that Monty knows what all the boxes contain and always chooses the empty ones. You have to know the rules to understand the problem, and since I've never seen this show, how could I understand automatically?
With this rule it actually makes perfect sense, and I think any 5-year-old could understand the logic in it. This rule reduces the problem to: "you pick 1 box out of 100; what is the chance of you picking the right one, versus the chance that one of the other 99 boxes contains the right one?"


mbhatnagar
Posted 30 May 2008 at 01:49 am

Alan has given a pretty good explanation here. I had read this on Wikipedia earlier, but didn't completely understand it.


Sacred Junk
Posted 29 July 2008 at 11:26 pm

sulkykid said: "Under the "rules of the game" after the first guess a non-winning prize is *always* revealed and the contestant is then *always* given a new guess with only two doors left. This means the first guess amounts to showmanship and the actual odds of winning the prize were always 50/50.""

I was just going to post something similar.
At first you have a 50/50 chance of choosing the right door.
If you had a 1/3 chance, then 1/3 of the time the first door opened should have been the right door, which is not the case.
But since the first door opened is always the wrong door, your initial odds are 50%.
Even after the first door is opened, the odds for both doors remain at 50%.


Erik B.
Posted 18 September 2008 at 12:56 pm

Does not compute.

However, this one is easy:

''Imagine that you flip an ordinary coin 99 times, and amazingly, it comes up heads every time. What are the odds that it will come up heads again on the 100th flip?''

One hundred percent. The coin is doctored, duh!


BenKinsey
Posted 01 October 2008 at 07:07 am

Very interesting. I didn't understand at first, but due to the many excellent examples I get it. Thanks, Alan.


tikigod4000
Posted 05 January 2009 at 10:57 am

MY BRAIN!!! MY BRAIN!!


howardfrankfort
Posted 14 February 2009 at 10:42 am

The least likely scenario odds-wise is that you have chosen the winner with 1 of 3 choices. If the odds favor your having chosen a goat in step 1, accept that you probably have. What doors would this scenario leave Monty? A winner and a loser. He cannot choose the winner, so if odds are your only indicator, and they favor your having chosen a goat, HE had no choice to make; you forced him to eliminate the other goat. So if odds are you chose a goat to begin with, and you know what that would have forced Monty to do, what door would you pick? The switch takes into account your choice and Monty's choice. Instead of using your 33% odds of getting the prize in step 1, use your 66% odds of picking a goat in step 1 plus, if that happened, Monty's 100% odds of picking the other goat for you.


tactilejones
Posted 18 April 2009 at 04:05 pm

Okay, head hurts, nausea setting in, and yet I'm compelled to exacerbate the situation by tediously registering, logging in, and commenting via T9 text from my cellphone!! So, I think I get the concept, initially... switching to the 2/3 odds from the 1/3 odds and all that. But then, consider this scenario: No previous gameplay. Two doors, one contains a prize, one contains nada. You "own" one door. You're given the choice to switch doors, or not, before the prize-containing door is revealed. Whether you switch or not, your chances are 1/2. Other than the "no previous gameplay", this is the scenario you're presented with in Monty's game, no? So if previous odds and outcomes are not relevant, as according to the Gambler's Dilemma (as presented here), shouldn't "pick one of two doors randomly containing a prize" remain at 1/2 odds, regardless of whatever silly game you were playing a moment before? HELP!! I will not eat or sleep until I understand this!


sulkykid
Posted 18 April 2009 at 05:53 pm

Previous odds and outcomes ARE relevant as any horseplayer knows.


Bob Nesbo
Posted 09 November 2009 at 02:48 am

I believe Edgar Allan Poe had an entire dissertation regarding something similar to this. I read it years ago, and I will have to search for it...


Alucin Veritas
Posted 26 December 2009 at 01:23 am

An inversion of thought might provide clarity. Instead of having the probability of being right, look at the probability of being wrong. Initially, there is a two thirds probability of being wrong. Then, in the switch there is a one in two probability of being wrong. Since the two probabilities are connected, they are multiplied. The result is that if a switch has occurred, then there is a one in three probability of being incorrect.


cornflower
Posted 28 April 2011 at 06:01 pm

I'd like to point out what I think about the coin tossing question. The question "What are the odds that..." is the key here, and is what may make the problem misleading.
It makes a difference when you say: "What are the odds that the coin will come up heads on the next (100th) flip?" and when you say: "What are the odds that you will get a series of 100 coin flips that come up all as heads?" The illusion is in that we tend to think of the second formulation of the question rather first, as our brain tends to recognize stark patterns more quickly, as the article says.


cornflower
Posted 28 April 2011 at 06:02 pm

*rather than the first


Evil1
Posted 25 January 2012 at 11:01 am

If there were 1 Billion doors to choose from, and I picked one randomly that I thought may contain the prize, I would have a 1 in 1 Billion chance of picking the right door first time (very little chance of success). If the host then opened 999,999,998 doors that HE knew did not contain the prize (leaving my original picked door and only one other door), that would mean that either my door was right (on first choice) - or the only remaining door he has left unopened is right and the odds would suggest I should swap to his door. Much better odds than the three doors but the principle is the same, unless someone sees a flaw in my reasoning?
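[Editor's note: the billion-door reasoning above checks out numerically. A sketch in Python generalizing to n doors (my own illustration, with smaller door counts for speed); since the host opens every losing door except one, switching wins exactly when the first pick was wrong:]

```python
import random

def switch_win_rate(n_doors, trials=50_000):
    """Estimate the win rate of always switching with n doors.
    The host opens all doors except the pick and one other, so
    switching wins precisely when the original pick missed the prize."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(n_doors)
        pick = random.randrange(n_doors)
        # Switching lands on the prize unless the first pick was the prize.
        wins += (pick != prize)
    return wins / trials

print(f"{switch_win_rate(3):.3f}")     # ~0.667
print(f"{switch_win_rate(100):.3f}")   # ~0.990
```

With a billion doors the switcher's win rate would be 999,999,999 in 1,000,000,000, matching the comment's intuition.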


Perry Curling-Hope
Posted 06 May 2012 at 03:57 am

People make this ‘problem’ hugely complicated, running simulations and introducing all manner of irrelevancies to try to ‘prove’ that the odds prevail.
They do, and always will, by simple logic, unless some metaphysical precognition is at work to confound them, but that is not the issue at hand here.

If the contestant applies the strategy of not switching, the odds are a simple 1 in 3 of picking the car, whatever the host does is irrelevant as it will have no bearing on the result.

If the contestant applies the strategy of switching, the odds are still a 1 in 3 of initially picking the car, and the contestant will lose, as the host will reveal the one goat, and the contestant will migrate to the other goat.
The odds of the contestant initially picking a goat are of course 2 in 3, and the contestant will win, as the host will reveal the other goat, and the contestant will migrate to the car.

It is only the initial selection by the contestant which has anything to do with randomness and probability, after that, it plays no part, and the outcome is determined.
The host has to know where the car is, and is presumably constrained to revealing a goat after the contestant's choice, for the 'Monty Hall Problem' to prevail.


Jay M.
Posted 20 August 2014 at 09:08 am

This is, in my view, the simplest way to look at it.

Two-person card game between Bob and Sue. Three-card deck with one marked card, shuffled into random order.

STEP ONE: Sue picks one card, which she must place face-down on the table without looking at it.

STEP TWO: Bob gets the other two cards. He looks at his cards and is required to discard one blank card, face up.

STEP THREE: Prior to the cards being turned over, Sue is given the choice of keeping her card or swapping with Bob’s remaining card. Should she swap?

ANALYSIS: Sue should swap, because their original chances of having the winning card (1 in 3 for Sue, 2 in 3 for Bob) have not changed by virtue of Bob’s revealing his blank card. Whether he does or doesn’t have the winning card, he will always have a blank card to show, so this is just theater to confuse the issue.

[In the actual TV show, Monty Hall did not have to give the contestant the choice of switching, which changes the odds. I am not accounting for that here.]

