#41
Unfortunately, Excel doesn't tell me what you're doing or why you are doing it.
Why did you compute "the ratio of 92.329% over 0.6% is 154"? Why would that ratio have any relevance?
#42
I'm not using the standard Bayes formulas - I'm actually working this out by brute force, because I want to see the trend for various disease rates; it's more of a practical exercise. That ratio I mention (positive-result accuracy over disease rate) tends towards the overall accuracy of the test when the disease is rare and the false positives dominate the true positives.

Consider this situation - make the disease rate 1 in 1 million, or 0.0001%. With all other parameters the same, we have only one person (out of 1 million) with the disease, which the test catches (rounding up). There are no false negatives (rounding again). There are also 500 false positives - recall the 0.05% false positive rate. So now the rate at which a positive is accurate is 1 in 501, or 0.199%. The ratio of 0.199% to 0.0001% is 1990:1 - very close to the ratio of true negatives to false positives, which is actually 1999:1, since the false positive rate is 1 in 2000. This confirms for me that my math is accurate, as it kind of goes in a circle.

In any case, going back to Steve's post (#28 above): the chance of a positive result being true depends on both the disease rate and the accuracy of the test, and the rarer the disease, the more the accuracy (specifically, the false positive rate) matters.
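The brute-force arithmetic above is easy to reproduce. Here is a short Python sketch of my own (not from the thread); it assumes a perfectly sensitive test, matching the "the test catches it" rounding in the post, and the post's 1-in-2000 false positive rate:

```python
from fractions import Fraction

def positive_accuracy(disease_rate, false_pos_rate):
    """Chance that a positive result is a true positive, assuming a
    test that catches every real case (100% sensitivity)."""
    true_pos = disease_rate                          # everyone sick tests positive
    false_pos = (1 - disease_rate) * false_pos_rate  # healthy people flagged anyway
    return true_pos / (true_pos + false_pos)

rate = Fraction(1, 1_000_000)   # disease rate: 1 in 1 million
fpr = Fraction(1, 2_000)        # false positive rate: 0.05%
ppv = positive_accuracy(rate, fpr)
print(float(ppv))        # roughly 0.002, i.e. about 1 in 501
print(float(ppv / rate)) # roughly 1996, within rounding of the 1999:1 ratio above
```

As the disease rate shrinks, the ratio of positive-result accuracy to disease rate climbs toward 1 over the false positive rate, which matches the trend the post describes.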
#43
OK, I get your point. It's neat that it tends toward that ratio (even though the ratio itself doesn't tell us anything we don't already know).
#44
The solution to Monty Hall is easier to understand if you run the numbers instead of trying to reason it out. Don't worry - to most people, the answer the math gives just feels wrong.

Let's assume that you are playing the game many times, and see how many times you win with each strategy.

Game 1: You picked the right door. Monty Hall opens one of the other doors. By sticking with your door you win; by switching you lose. Stick with it: 1, Switch: 0.
Game 2: You picked the first wrong door. Monty Hall opens the second wrong door. Now, if you stick with it you lose; if you switch, you win. Stick with it: 0, Switch: 1.
Game 3: You picked the second wrong door. Monty Hall opens the first wrong door. Again, if you stick with it you lose; if you switch, you win. Stick with it: 0, Switch: 1.

So, out of 3 games, sticking with it won you once, and switching won you twice. With switching you are twice as likely to win as with sticking.
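Those three games can be played by the thousands in code. Here is a quick Python simulation (my own sketch, not from the post) of the version where Monty opens every non-chosen door except one:

```python
import random

def play(switch, doors=3, trials=100_000):
    """Play Monty Hall many times; return the fraction of wins."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(doors)
        pick = random.randrange(doors)
        # Monty opens all other doors except one, never revealing the car,
        # so the lone unopened door holds the car whenever your pick was wrong.
        wins += (pick != car) if switch else (pick == car)
    return wins / trials

print(play(switch=False))  # close to 1/3
print(play(switch=True))   # close to 2/3
```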
#45
For Monty Hall, for those who don't want to wade through the logic, change the number of boxes.

There is 1 car hidden in 100 boxes, and you pick 1. I reveal 98 goats from the remaining 99 boxes and ask if you want to switch. Everyone we tried this with switched. When the number of boxes is greater, it is easier to see the rationale.
#46
Well, sure, but it only looks easier if you open a whole bunch of doors. How about 100 doors, one car, and Monty opens only one non-car door? It's still better to switch, but it's probably even harder to see why than with the three-door problem.
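That "opens only one of 99" variant can actually be worked out exactly rather than intuited. A small check of my own, using Python's exact fractions and the standard assumption that Monty knowingly opens a goat door at random:

```python
from fractions import Fraction

doors = 100
# Stick: you win only if your first pick (1 of 100) held the car.
stick = Fraction(1, doors)
# Switch: the car is behind one of the other 99 doors with probability 99/100.
# After Monty opens one goat door, the remaining 98 closed doors share that
# probability evenly by symmetry, so a random switch wins (99/100) * (1/98).
switch = Fraction(doors - 1, doors) * Fraction(1, doors - 2)

print(stick, float(stick))    # 1/100 = 0.01
print(switch, float(switch))  # 99/9800, about 0.0101
```

So switching is still the better play, but only by a sliver, which is exactly why this variant makes a poor teaching example.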
#47
That's the whole premise of the original problem. All but one door is opened by Monty.
It shows the probabilities in a much clearer fashion.
#48
I want to draw some pictures, but it's too much trouble...
You've got three boxes, initially with a 1/3 chance that the prize is in each box. The player picks one box, so there's a 1/3 chance the prize is in that box. Draw a circle around the other boxes - there's a 2/3 chance that the prize is in the circle, and a 1/3 chance that the prize is outside the circle. Then the host eliminates one of the boxes in the circle - there's still a 2/3 chance that the prize is in the circle (nobody's moved anything), but there's now only one box in the circle that it could be in.
#49
In the three-door problem he only opens one door. In the 100-door problem whether he opens one or 98, it's still better to switch. So I wouldn't say that's part of the premise.
#50
Sure it is, if you are looking at it as it was presented to us. It shows that it is not 1 in 2, as most people in my class pegged it.
#51
If you're trying to come up with an illustration to make it clearer to somebody who can't get it from a more mathematical explanation, though, why would you complicate it unnecessarily by having 100 boxes and then only opening 2 of them? The point of UEL's example was to make the reasoning obvious, not to introduce more complication to the maths...
#52
Don't go so far as 100 doors - just go to 4, then 5 if necessary. Changing the game so that more than a single closed door remains to choose from is, indeed, moving the goalposts. (How typical of you, ganzfeld, to do this.)
So with 4 doors, you had a 1 in 4 chance of choosing the prize to start, but a 3 in 4 chance of being wrong. Switching wins you the prize 3 out of 4 times. Suppose that when playing the game, you *always* pick door #1. There are only 4 scenarios:

1. The prize is behind door #1 - the one you chose. The host shows 2 other doors (doesn't matter which two) with no prize. The last door, obviously, has no prize. Switching fails to win.
2. The prize is behind door #2. The host shows doors 3 and 4 - with no prize. Switching wins the prize.
3. The prize is behind door #3. The host shows doors 2 and 4 - with no prize. Switching wins the prize.
4. The prize is behind door #4. Again, the host shows doors 2 and 3 - no prize. Switching wins the prize.

So in 3 of the 4 cases, it pays to switch.

For those who are confused by my assumption that you always pick door #1, think of it this way: the doors are not labelled until you pick a door - then that door is labelled #1, and the remaining doors are labelled in order.

For 5 doors and one prize, it is still easy enough to describe all of the scenarios in one paragraph. It takes a bit of a "leap" to see it with 100 doors, as UEL describes, because you then can't work it out by "brute force" methods. But being able to visualize it really helps people to understand.
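That scenario list generalizes to any number of doors. Here is a brute-force Python enumeration of my own that mirrors the "you always pick door #1" trick, with the host opening every goat door except one:

```python
from fractions import Fraction

def switch_win_rate(doors):
    """Enumerate one scenario per prize location; the host opens every
    goat door except one, and we switch to the last closed door."""
    wins = 0
    for prize in range(1, doors + 1):
        pick = 1                                   # you always pick door #1
        closed = [d for d in range(1, doors + 1) if d != pick]
        opened = [d for d in closed if d != prize][:doors - 2]  # goats shown
        final = next(d for d in closed if d not in opened)      # the switch door
        wins += (final == prize)
    return Fraction(wins, doors)

print(switch_win_rate(4))  # 3/4
print(switch_win_rate(5))  # 4/5
```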
#53
It's fun to talk about how the conditions matter, though. Like this variation, which is good when lots of people know the original:
Same setup as before, but after you choose your door, Monty flips a coin to decide which of the two remaining doors to open. The open door reveals a goat. Should you switch?
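This coin-flip variant really does come out differently, and it can be settled by exact enumeration. A Python sketch of my own, conditioning on the randomly opened door happening to show a goat:

```python
from fractions import Fraction

# Doors 0, 1, 2; you always pick door 0; Monty flips a coin between doors 1 and 2.
stay = switch = goat_shown = Fraction(0)
for car in range(3):                  # car placement: probability 1/3 each
    for opened in (1, 2):             # coin flip: probability 1/2 each
        p = Fraction(1, 3) * Fraction(1, 2)
        if opened == car:
            continue                  # car revealed - game over, not our case
        goat_shown += p
        other = 3 - opened            # the remaining unopened door
        stay += p if car == 0 else 0
        switch += p if car == other else 0

print(stay / goat_shown)    # 1/2
print(switch / goat_shown)  # 1/2
```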
#54
No, in that case your chances really are 50/50. Well, you can switch if you want, but you have the same odds if you stay.
ETA: I'm assuming that if the coin toss had gone the other way, and if the door revealed the prize, the game would have ended right there with the contestant losing.
#55
When you think about "size" of a physical object you of course have to have some practical level of uncertainty, and unless you're doing some sort of nanofabrication that uncertainty is going to be orders of magnitude larger than atoms. Atoms, meanwhile, are (surprisingly) not difficult to (more or less) visualize with our current level of technology (HR-TEM, AFM, etc.), and for these purposes you can simply determine the size of an atom as the distance between two neighboring atoms. But of course this means the size of the atom changes somewhat depending on what kind of structure it's in and what it's bound to. In either case, the size of an atom is the size of its electron shell, which is pretty close to spherical for most purposes.
The cross-section of particles is therefore best described in terms of the area over which they interact with other particles, so of course that measurement will only include those forces which are most significant to this interaction.
It's reasonable to think that cross-sectional area is analogous to - but not the same as - size, and that size as we think of it (or really any macroscopic properties as we think of them) just doesn't apply to things at such minuscule scales.

Last edited by Alchemy; 08 February 2014 at 11:11 PM.
#56
erwins/Steve - the "omniscient" host is part of the premise that creates the paradox. In the classic 3-door example, one figures that eliminating one of the 3 choices creates a 50/50 chance at winning, yet switching is twice as successful as not switching - thus the paradox.
Consider another game situation that appears - at least to me - as a paradox. Suppose you have 4 contestants on a show. The first contestant is asked a multiple-choice question with 4 possible answers. If they get it wrong, the next contestant gets a chance to guess from the three remaining answers. If they get it wrong, the third contestant has a 50/50 chance. If the first three get it wrong, the last contestant wins by default. What are the odds that the fourth contestant wins?

The answer is 1 in 4, and I had to prove it to myself like this. Assuming all answers are random, the first contestant has a 1/4 chance of being right, and a 3/4 chance of being wrong. So 3/4 of the time it goes to contestant #2. They have a 1 in 3 chance of being right, but a 2/3 chance of being wrong. It continues on, much like figuring out the odds for craps. The chance of winning as the fourth contestant is a subset of the chances that everyone before was wrong - and that is 3/4 * 2/3 * 1/2 = 1/4.
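That chain of fractions is quick to verify in code. A few lines of Python (my own check, using exact fractions):

```python
from fractions import Fraction

p_reached = Fraction(1)          # probability the question reaches this contestant
for options in (4, 3, 2):        # contestants 1-3 guess among 4, 3, 2 answers
    p_win = p_reached * Fraction(1, options)
    print(p_win)                 # each contestant's overall chance: 1/4
    p_reached *= Fraction(options - 1, options)   # chance they guess wrong
print(p_reached)                 # contestant 4 wins by default: 1/4
```

Every contestant's overall chance comes out to 1/4, which is why the apparent paradox dissolves: the later contestants' better conditional odds are exactly offset by the chance the game never reaches them.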
#57
I'm just saying that if you explain it that way someone will say, "no, you changed the rules; last time only one was opened." Then try to convince them that even when Monty only opens one of the 99 the odds are better if you switch. Have fun!
#58
I don't know. Is it a baby goat? Because they're adorable, and I'd keep it.