Ask Professor Puzzler

Before getting to the question for this morning, I'd like to direct your attention to our simulation of a game show with three doors. It's called the "Monty Hall Game." Understanding this game will be important to thinking about Tracie's question below.

The short summary of the game: There are three doors, and only one of them has a prize behind it. You pick a door. Then Monty opens another door to show you that it's empty, and asks you if you want to switch your guess. What do you do?

Tracie from South Dakota asks, "So if the game show host opened 2 doors for me, would I still have 1/3 probability of winning?"

Hi Tracie, your question highlights a couple of things about the Monty Hall game that often cause confusion.

  1. Monty Hall knows in advance where the pot of gold (or new car, or whatever) is. The rules of the game are that he opens an empty door. But if he always opens an empty door, that means he knows where the empty doors are. That's an important concept to understand. He's not being entirely random in his choice of doors to open. There are two empty doors, so if you pick one of them, he picks the other. The only time he's being random is when you pick correctly; in that circumstance he randomly picks one of the empty doors to open.
  2. This is not a static problem - the circumstances change, and because the circumstances change, the probabilities could change as well. Let me give you an example. Suppose that during Christmas vacation, we decide we want to take our kids sledding. Since I'm not teaching that week, I can pick any day of the week to go. So we would say that my probability of picking Tuesday is 1/7 (there are seven days in a week, and Tuesday is one of them). Now suppose that my wife looks at a weather forecast, and says, "Oh! Thursday, Friday and Saturday are supposed to be cold, with a wind chill of -20 F!" If I use that information in making my choice, what is my probability of picking Tuesday now? It's 1/4, because we've eliminated 3 of the 7 days. That doesn't mean the original 1/7 is wrong - it means that the conditions of the problem have been changed, and so we have to calculate a new probability. This is not, by the way, a perfect analogy to the Monty Hall problem, so please don't try to make it match. The differences are:
    1. We're talking about the probability that I'll pick a certain day, not the probability that the day I pick is a good one.
    2. We don't know that there's only one good sledding day.
    3. My wife is not deviously hiding known information from me.
    4. Weather forecasts are not 100% accurate anyway.
    5. There is no #5, but at the end of this post, I'll - just for fun - turn this example into a better analogy for the Monty Hall Game.

Okay, so how does this relate to your question?  Number two should be fairly obvious; we have a change in conditions, so there's no reason to assume that the probability will stay the same. The probability of you winning WAS 1/3, but the changing conditions mean we have to recalculate the probability.

But here's the more important issue: Monty Hall can't play the game the way you suggest. Why not? Because if the rule of the game is "Monty will open two empty doors," Monty is going to run into trouble every time you don't pick the right door. If you pick one of the two empty doors, how many empty doors are left for him to open? Only one!

If he opens two doors, that means you've picked the correct door.  So even though your original choice was 1/3 probability of winning, under the new circumstances, you have a probability of 1 (100% chance) of winning.

If you're wondering how that number 1/3 fits into the solution, it fits like this: The probability that Monty can open two doors is 1/3 (the same as the probability that you selected correctly). So the number 1/3 is still in there - just in a different place!

If you're wondering why the probability changes in this case, but doesn't when he opens one door, the answer is this: When he opens just one door, he has not given you any information about the door you picked. When he opens two doors, he has (indirectly) given you information about your door.
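If you'd like to convince yourself of the standard one-door result, here's a minimal simulation sketch in Python (the function name, seed, and trial count are my own, not part of the site's game):

```python
import random

def monty_hall(switch, trials=100_000, seed=42):
    """Simulate the three-door game; return the fraction of games won."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        prize = rng.randrange(3)    # door hiding the prize
        choice = rng.randrange(3)   # contestant's first pick
        # Monty opens an empty door that isn't the contestant's pick;
        # when the pick is correct, he chooses randomly between the two.
        opened = rng.choice([d for d in range(3) if d != choice and d != prize])
        if switch:
            # switch to the one remaining closed door
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == prize)
    return wins / trials

print(monty_hall(switch=False))   # stays near 1/3
print(monty_hall(switch=True))    # stays near 2/3
```

Switching wins exactly when the first pick was wrong, which happens 2/3 of the time - the simulation just makes that visible.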

To add to this, the only way Monty could use the two-door rule would be to have a pair of rules:

  1. If you pick the right door, he will open two doors. (this happens 1/3 of the time)
  2. If you pick the wrong door, he will open one door. (this happens 2/3 of the time)

The problem with this is: if you know the rules, you have a guaranteed winning strategy. If he opens two doors, you've automatically won, and if he opens only one door, that means you picked the wrong door, so you switch and win. Monty would DEFINITELY not want to play by those rules!

Addendum: If you would like to look at a couple different ways of understanding the Monty Hall problem, you can find more write-up here: The Murky Swamp of Probability.


Sledding in December. Let's say my wife looked at the weather forecast and told me, "Every day but one this week is going to have a wind chill of -20 F, and there will be one day that's going to be sunny and warm. So, randomly pick a day for sledding." (My wife hates the cold, so she would definitely not do something as ridiculous as that!) We'll make the wild assumption for this example that the National Weather Service has 100% predictive accuracy.

So I pick Tuesday. I have a one-in-seven chance of "winning," based on the information I have. Of course, my wife has a different perspective, because she has more information than I do. She knows with 100% certainty whether I've picked correctly or not!

Now my wife says, "Okay, I'll tell you that the following days are going to be -20 F: Sunday, Monday, Thursday, Friday, Saturday."

She's narrowed my choices down to two possibilities. But it's important to remember that this was not a completely random choice. She didn't pick 5 out of 7 days to eliminate; she picked 5 out of 6! She couldn't eliminate Tuesday, because that was the day I had picked, and if she eliminated it, I would be forced to change.

My reasoning now goes as follows: Based on the original information I was given, Tuesday had a 1/7 probability of being the best day for sledding. That means there was a 6/7 probability that one of the other days was the good sledding day. So there's a 6/7 probability that the good sledding day is Sunday, Monday, Wednesday, Thursday, Friday, or Saturday. As a probability equation, I could write:

Psu + Pm + Pw + Pth + Pf + Psa = 6/7

But now my wife has changed the problem. She's changed it by giving me more information: she's told me that all but one of those probabilities are zero!

That means: 0 + 0 + Pw + 0 + 0 + 0 = 6/7, which means that Pw = 6/7!

So do I switch my decision to Wednesday? You bet I do! The key to understanding this is that my wife's choice of information to give me is not random, and it therefore significantly alters the shape of the problem.

I hope that this is a helpful answer. The Monty Hall problem is one that repeatedly trips people up. I wonder if my blog post will increase the probability of people not getting tripped up by it!

PS - if you go to the Monty Hall game and change the game settings to 7 doors, 5 hints, that will be the same as my sledding example. Turn on the automation, and watch the percent. If you let it run long enough, you'll see that it eventually settles down to about 85.7%, which is the 6/7 probability we calculated.
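If you'd rather check the 6/7 figure without the site's simulator, here's a sketch of the same experiment (the function name and parameters are mine; the always-switch strategy is hard-coded):

```python
import random

def switch_win_rate(doors=7, hints=5, trials=200_000, seed=1):
    """The host rules out `hints` bad doors - never the prize, never your
    pick - and you then switch to a remaining closed door you didn't pick."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        prize = rng.randrange(doors)
        pick = rng.randrange(doors)
        # doors the host is allowed to eliminate
        candidates = [d for d in range(doors) if d != prize and d != pick]
        eliminated = set(rng.sample(candidates, hints))
        # switch to the remaining door that isn't yours or eliminated
        final = next(d for d in range(doors) if d != pick and d not in eliminated)
        wins += (final == prize)
    return wins / trials

print(round(switch_win_rate(), 3))   # settles near 6/7, about 0.857
```

Switching loses only when the first pick was already right (1/7 of the time), so the long-run rate heads toward 6/7.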

Vusi from South Africa wants to know - if you roll a blue die and a yellow die, why are these considered to be independent events?

Well, Vusi, let's start off by making sure we understand the terms "dependent events" and "independent events."

Dependent Events
Two or more events are dependent if the outcome of one of the events will affect the outcome of the other.

For example, suppose you have a jar that contains 100 blue jelly beans and 100 yellow jelly beans, and you draw one jelly bean and eat it. The probability that the jelly bean will be blue is 0.5 (100/200, because there are 100 blue jelly beans and a total of 200 jelly beans all together). However, if you then draw a second jelly bean from the jar, the probability of it being blue is not necessarily 0.5. In fact, we really don't know what the probability is. Why? Because if the first draw was blue, the probability will be 99/199 (we have one less blue, and one less total), but if the first draw was yellow, the probability of a blue on the second draw will be 100/199 (we still have 100 blues, but one fewer yellow). Thus, the probability of the second event can't be calculated without knowing the result of the first event. The first event affects the outcome of the second event. These are therefore dependent events.

Independent Events
Two or more events are independent if the outcome of one event has no effect on the outcome of another.

To keep this simple, let's talk about the same jar. Only this time, instead of eating the jelly bean, you put it back in the jar after drawing it out. The probability that the first jelly bean drawn is blue is 0.5. But what about the second drawing? In the second drawing, you still have 100 blue jelly beans and a total of 200, because the first jelly bean drawn was put back in. Thus, it makes no difference what you draw the first time; the probability for the second draw being blue is still 0.5. The first draw does not affect the second draw, so these are independent events. 
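Both situations can be checked with exact fractions. This is my own sketch, not part of the original answer; the variable names are made up for illustration:

```python
from fractions import Fraction

blue, total = 100, 200

# Without replacement (dependent): the second draw depends on the first.
second_blue_after_blue = Fraction(blue - 1, total - 1)    # 99/199
second_blue_after_yellow = Fraction(blue, total - 1)      # 100/199

# With replacement (independent): the jar is restored before the second
# draw, so the probability is the same no matter what came first.
second_blue_with_replacement = Fraction(blue, total)      # 1/2

print(second_blue_after_blue, second_blue_after_yellow)   # 99/199 100/199
print(second_blue_with_replacement)                       # 1/2
```

The two different "after blue" and "after yellow" values are exactly what makes the no-replacement draws dependent; the single unchanging value is what makes the with-replacement draws independent.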

Hopefully, with this in mind, you can answer your own question. Are the rolls of a blue die and a yellow die independent? Yes they are! Because the roll of the blue die is not going to affect the roll of the yellow die. The yellow die isn't going to think, "Oh, the blue die is showing a 3, so I better not land on 3," or "The blue die is showing an even number, so I should show an odd," or "Hey, the blue die is showing a 6, so if I land on 6 we'll have doubles!" Whatever happened with the blue die has absolutely no bearing on what happens with the yellow die. Therefore, these are independent events.

By the way, we have a reference unit on probability on this site, which you can find here: Probability concepts and problems. This unit contains a section on independent events and dependent events, so you can read more on that here: Independent Events, Dependent Events.

Thanks for asking!

Mathi, from Vellore, wants to know how to figure out the following game show probability:

"On a game show, a contestant is given three keys, each of which opens exactly one of three identical boxes. The first box contains $1, the second $100, and the third $1,000. The boxes are randomly lined up and the contestant gets to assign each key to one of the boxes. The contestant wins the amount of money contained in each box that is opened by the key he assigns to it. What is the probability that a contestant will win more than $1,000?"

I'm not going to answer that exact question, because I think we can make it more challenging and interesting by changing the numbers a bit. Let's do this one instead:

"On a game show, a contestant is given four keys, each of which opens exactly one of four identical boxes. The first box contains $250, the second $500, the third $750, and the fourth $1,000. The boxes are randomly lined up and the contestant gets to assign each key to one of the boxes. The contestant wins the amount of money contained in each box that is opened by the key he assigns to it. What is the probability that a contestant will win more than $1,000?"

First, we need to figure out how many ways the keys can be arranged. The first key can be assigned in 4 ways, the second one in 3 ways (since one key has already been placed), the third key in 2 ways, and then there's only one way to place the last key. That gives us a total of 4 x 3 x 2 x 1 = 24 ways. So if we can figure out how many possibilities are wins, all we need to do is divide that by 24 to get the answer.

It's a win if you place all the keys in the right position. There is one way to do that.

It's a win if you place $1000 and any one of the others correctly. (Note that you can't place $1000 and two others correctly, because if you place three of them correctly, the fourth one must be correct as well, and we've already counted that possibility!). So there are three ways to do that.

It's a win if you place the $500 and $750 correctly (but not the other two, since we've already counted that possibility!). You can do that in one way.

And there are no other combinations that work. Thus, we have a probability of (1 + 3 + 1)/24 = 5/24.
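The counting argument above can be verified by brute force over all 24 key arrangements (a quick sketch; the variable names are mine):

```python
from fractions import Fraction
from itertools import permutations

values = [250, 500, 750, 1000]   # money in boxes 0..3

# Every way to assign the four keys to the four boxes
arrangements = list(permutations(range(4)))

# Box i pays out only if it receives its own key, i.e. assign[i] == i
wins = sum(
    sum(values[i] for i in range(4) if assign[i] == i) > 1000
    for assign in arrangements
)

print(Fraction(wins, len(arrangements)))   # 5/24
```

The loop finds the same five winning arrangements counted by hand: all four correct, $1,000 plus one other (three ways), and $500 plus $750.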

Now that we've gone through that, you'll be surprised at how easy the other problem is - there are far fewer combinations to consider!

Good luck!
Professor Puzzler

Today's question is a probability question from Sue.  Her question is about how to calculate the probability of winning a game at least a certain number of times.

Let's say the probability of Sue winning Fraction Concentration against the computer is 3/4.  If she plays 8 times, how do you calculate the probability that she'll win at least 6 times?

If she wins at least six times, that means she could win exactly 6, 7, or 8 times.

So we're going to have to work out the probability that she wins exactly 6 times, the probability that she'll win exactly 7 times, and the probability that she'll win exactly 8 times.  Once we've found those three probabilities, we add them together to get the total probability.

Note that in the equations below, 3/4 is the probability of a win, and 1/4 is the probability of a loss (since 1 - 3/4 = 1/4). Each product also has to be multiplied by the number of different orders in which the wins can occur: for example, six wins and two losses can be arranged among eight games in C(8,6) = 28 ways.

Six wins: C(8,6) x (3/4)^6 x (1/4)^2 = 28 x (3/4)^6 x (1/4)^2

Seven wins: C(8,7) x (3/4)^7 x (1/4) = 8 x (3/4)^7 x (1/4)

Eight wins: C(8,8) x (3/4)^8 = (3/4)^8

Now, depending on who your teacher is, and whether or not they like to torture you with fractions, you'll either have an ugly fraction, or a decimal answer.  I came up with approximately 0.6785.

Now that you know the process, you can fill in any numbers you like. Make the probability of winning 0.05. Make the number of games 20. Whatever you like.

Helpful tip: Suppose you played 20 games, and wanted the probability of winning at least 5. Using this method directly, you'd have to find the probability of winning exactly 5, 6, 7, 8, ..., 18, 19, or 20 games. Not pretty.

So instead, find the probability of winning NO MORE than 4 games - that's only five terms (0, 1, 2, 3, or 4 wins) - and subtract it from 1. That's a much easier calculation, and it comes out to the same thing!
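Both the direct sum and the shortcut are easy to write down. Here's a sketch, assuming Python 3.8+ for math.comb (the function names are my own):

```python
from math import comb

def p_at_least(k, n, p):
    """Probability of at least k wins in n independent games, win prob p."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

def p_at_least_complement(k, n, p):
    """Same value, via 1 minus the (shorter) 'fewer than k wins' sum."""
    return 1 - sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k))

print(round(p_at_least(6, 8, 0.75), 4))              # Sue's game: 0.6785
print(round(p_at_least_complement(5, 20, 0.05), 4))  # 20 games at p = 0.05
```

For "at least 5 of 20," the complement version sums five terms instead of sixteen, but either function gives the same answer.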

Happy computing!
Professor Puzzler
