Wikipedia:Reference desk/Archives/Mathematics/2012 December 26

= December 26 =

Deal or No Deal?
What is the probability of picking the 1 million dollar box out of the 25? And what is the probability of NOT picking the 1 million dollar box after 24 tries? I hope someone gets what I'm trying to ask; not a native English speaker here. 203.112.82.129 (talk) 22:44, 26 December 2012 (UTC)


 * If there are 25 boxes, one of which contains the million, and your choice is truly random, then you have a 1/25 chance, or 0.04 probability, of picking it. If 24 of the 25 boxes are then opened, the chance the million is among them is 24/25, or 0.96. That's a 96% chance of finding it in those 24 boxes, and a 4% chance that it's still in the last one. StuRat (talk) 23:08, 26 December 2012 (UTC)
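As a quick sanity check, the arithmetic above can be reproduced with exact fractions in a few lines of Python (a sketch of the 25-box setup, not part of the original thread's notation):

```python
from fractions import Fraction

# Chance that a truly random pick out of 25 boxes holds the million:
p_pick = Fraction(1, 25)          # 0.04

# If 24 of the 25 boxes are opened, the chance the million is
# among them is 24/25 -- equivalently 1 - 1/25:
p_seen_in_24 = 1 - p_pick         # 24/25 = 0.96

print(float(p_pick), float(p_seen_in_24))
```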


 * So I guess what I want to know is: if there are 2 boxes left, the one that you picked and the last unpicked box, and what's left is $1 and $1 million, is it better to choose deal or no deal? 203.112.82.129 (talk) 23:13, 26 December 2012 (UTC)


 * The total value of the two remaining boxes is $1,000,001, and your expected return from keeping yours is therefore half that, or $500,000.50. So, if the banker offers you more than that, take it, based on math alone.


 * However, there are other concerns, like taxes and how much money you really need, that should also affect your choice. If the banker offers you $400,000, most people would be happier with that as a sure thing than a 50-50 shot at a million, as $400K might be enough to solve all their financial problems. StuRat (talk) 23:39, 26 December 2012 (UTC)
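The expected-value figure quoted above comes out of a one-line average over the two remaining amounts (a minimal sketch of that computation):

```python
# Two boxes remain: $1 and $1,000,000.
remaining = [1, 1_000_000]

# With no other information, each amount is equally likely to be in
# your box, so the expected value of saying "no deal" is the average:
expected = sum(remaining) / len(remaining)

print(expected)   # 500000.5
```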


 * A rich mathematician (do they exist?) who aims to maximise their return on the game would always say "no deal", since the banker's offer is always less than the expected value of the remaining boxes. In the UK version, there is some random and psychologically-aimed variation in the offers, but they tend to move closer to the expected value as the number of boxes decreases.  Real people make "illogical" choices for the reasons stated by Stu above.  They only get one opportunity to play the game, so "mathematical logic" doesn't apply: the negative consequence of refusing a good offer outweighs (psychologically and emotionally) the positive advantage of winning the maximum return.  I estimate that the organisers of the game would pay out about twice as much if everyone said "no deal".  I expect someone has done an analysis somewhere?    D b f i r s   08:35, 27 December 2012 (UTC)


 * I strongly resent the straw man implication that "Math says 50% chance of $1M is better, but the truth is that 100% of $400K is better". The mathematical answer takes the form of decision theory, in which agents don't maximize their expected money, they maximize their expected utility. Even disregarding specific tactical considerations, utility which is logarithmic in total net worth is a reasonable model. This reproduces the preference for a sure $400K for all but the richest players. -- Meni Rosenfeld (talk) 10:58, 27 December 2012 (UTC)
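Meni's point can be illustrated numerically. A minimal sketch assuming log utility of total net worth, with hypothetical net-worth figures ($50,000 for a typical player, $100 million for the "rich mathematician"); the function name and defaults are illustrative, not from the thread:

```python
import math

def prefers_deal(net_worth, offer=400_000, boxes=(1, 1_000_000)):
    """True if a log-utility player prefers the sure offer to a
    50-50 gamble over the two remaining box amounts."""
    u_deal = math.log(net_worth + offer)
    u_gamble = sum(math.log(net_worth + b) for b in boxes) / len(boxes)
    return u_deal > u_gamble

print(prefers_deal(50_000))        # modest net worth: the sure $400K wins
print(prefers_deal(100_000_000))   # very rich: the gamble has higher utility
```

Note that the expected money is the same $500,000.50 gamble in both cases; only the curvature of the utility function flips the decision, which is exactly the decision-theoretic point being made.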


 * Thanks for the interesting link (which led to the Expected utility hypothesis article). Using this model, one could claim that each player makes a rational decision at each stage, based on their own risk-aversion function.  I'd assumed that someone very rich would not have significant aversion to a low payout if their expected monetary return could be maximised, but even this might be an over-simplification.    D b f i r s   18:37, 27 December 2012 (UTC)


 * Also note that it's not just "psychological and emotional" factors that make the sure bet a better choice. There can be real, measurable reasons.  For example, if there's a luxury tax that kicks in on winnings above a certain level, that might make the sure bet a better net payout.  In some game shows, you have the choice of a sure bet in cash, or a chance at a more valuable prize.  Unfortunately, you can rarely sell the prize for anything close to the "retail value", and will owe taxes on it, so you might only be able to keep a small portion of that "retail value", or even take a loss in some cases.  Cash is king. StuRat (talk) 19:14, 27 December 2012 (UTC)


 * Much ink has been spilled amongst psychologists over the question of whether humans are ever irrational, or whether all seemingly irrational choices are actually cases of not fully considering the person's payoff matrix.--2601:9:1300:5B:DC71:6109:1167:7EA5 (talk) 04:46, 28 December 2012 (UTC)


 * In case readers don't understand what this is about, it's the television game show Deal or No Deal. – b_jonas 15:08, 27 December 2012 (UTC)


 * OP here. I appreciate the replies and I think they are good answers, but they miss my point, and I'm the one to blame. The answer I'm looking for is just about the probability; think of the Monty Hall problem. 203.112.82.1 (talk) 13:23, 29 December 2012 (UTC)


 * StuRat answered that in his first two replies. There is no "Monty Hall" complication because no-one has prior knowledge of box contents, and "on average" if you could play as many times as you wanted, you would maximise your income by always saying "no deal".  For a single game, the analysis of expected utility explains why people often make a logical decision to deal.    D b f i r s   16:56, 29 December 2012 (UTC)


 * Here's an explanation of why StuRat's analysis is correct:
 * In the Monty Hall problem, whether your initial choice of door has the car behind it is the only factor determining whether you should switch. If your first pick has the car, the remaining unopened door necessarily has a goat; if your first pick has a goat, the remaining door necessarily has the car. Furthermore, once you have picked a door initially, you always get to see all but one other door opened, and you always get the opportunity to switch. Since you're more likely to pick a goat door than the car door, you should switch when given the opportunity.
 * In Deal or No Deal, it's still the case that your endgame strategy depends only on your initial choice of box, but you don't always get to the endgame with the million-dollar box. Given that the contestant did not pick the million-dollar box at the beginning, the probability of them opening all but one of the boxes without seeing the million dollars is 1 / (number of boxes - 1), i.e. 1/24 here; but given that the contestant picked the million-dollar box, the probability of them opening all but one of the boxes without seeing the million dollars is 1. The two routes to the endgame are therefore equally likely: (1/25) × 1 = (24/25) × (1/24) = 1/25 each. The conditional probabilities exactly cancel each other out and you're left with the 50-50 choice above. « Aaron Rotenberg « Talk « 17:39, 29 December 2012 (UTC)
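That cancellation argument can also be checked empirically. A minimal Python simulation (the function name and trial count are illustrative) that conditions on games reaching the endgame with the million still unopened:

```python
import random

def share_of_endgames_holding_million(n_boxes=25, trials=100_000, seed=1):
    """Among simulated games that reach the endgame with the million
    still in play, return the fraction where the player's own box holds it."""
    rng = random.Random(seed)
    reached = held = 0
    for _ in range(trials):
        million = rng.randrange(n_boxes)      # which box has the million
        pick = rng.randrange(n_boxes)         # the player's own box
        others = [b for b in range(n_boxes) if b != pick]
        rng.shuffle(others)                   # random order of opening
        opened = others[:-1]                  # all but one other box opened
        if million not in opened:             # endgame reached
            reached += 1
            held += (pick == million)
    return held / reached

print(share_of_endgames_holding_million())   # close to 0.5
```

Only about 2/25 of simulated games reach the endgame at all, but among those that do, the player's box holds the million about half the time, matching the analysis above.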


 * Is it 50-50 at the end because the choices were random? 203.112.82.2 (talk) 19:10, 29 December 2012 (UTC)


 * Yes. StuRat (talk) 19:53, 29 December 2012 (UTC)


 * Thanks! 203.112.82.1 (talk) 20:22, 29 December 2012 (UTC)


 * You're quite welcome. May we mark this Q resolved? StuRat (talk) 06:02, 31 December 2012 (UTC)


 * YES! 203.112.82.1 (talk) 15:02, 31 December 2012 (UTC)