Let me start answering my Deal or No Deal-based question by just pointing to Chiaroscuro’s answer, which does the arithmetic exactly right and comes to a quite sensible conclusion from it. This leaves me feeling like I’m not quite earning my pay here, so let me go into further depth and ask that someone pay me.

The problem as set up was this: a Contestant has selected one of five (remaining) suitcases. Inside is equally likely to be $1, $10, $7,500, $25,000, or $35,000. The Contestant may either hold out for whatever is within her suitcase, or may accept the Banker’s offer of $11,750 to give up her suitcase and go home. Should she hold out for her suitcase or take the sure thing? Why? (I am making the assumption that the Contestant is equally likely to have selected any of these five prizes, and that the Banker does not know whether the Contestant picked a low or a high value.)

Chiaroscuro calculated what’s typically called the *mean* prize or the *expectation value* of the prize within the suitcase. “Mean” here is just the same word as the “mean” as in “average”. Imagine the Contestant playing exactly this game, picking one of five suitcases from among this set of winnings, many many times over, and add together the total winnings, then divide by the number of times the game was played. If we play enough times, that sum-of-winnings-divided-by-times-played will be the mean. “Expectation value” is another name for about the same idea and, for a wonder, is a technical term that’s maybe more self-explanatory than the common term. It’s the amount you would expect to get, on average, for each round of playing the game, if the game were played an enormous number of times.

(Technically speaking this only gets to be true if the game were played infinitely many times. Also, technically speaking, we couldn’t be positive that the average came to this expectation value, in the same way we couldn’t be positive that flipping a fair coin infinitely many times wouldn’t happen to come up tails every single time, but we can be pretty confident such a freak event wouldn’t happen.)
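That convergence is easy to see in a quick simulation — a sketch of my own, using the prize list and equal probabilities from the problem above. The long-run average winnings per play should hover near the expectation value computed below, $13,502.20:

```python
import random

# The five equally likely prizes from the problem above.
prizes = [1, 10, 7_500, 25_000, 35_000]

random.seed(42)  # fixed seed so the run is repeatable
plays = 200_000

# Play the game many times: pick a suitcase at random each time.
total = sum(random.choice(prizes) for _ in range(plays))
average = total / plays

# The sample average should sit close to the expectation value, $13,502.20.
print(f"average winnings per play: ${average:,.2f}")
```

With 200,000 plays the sample average lands within a few hundred dollars of the expectation value; with fewer plays, the freak runs mentioned above have more room to pull it around.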

When we have a set of possible different outcomes — the prizes we could get — and know the probability of picking each outcome, we can find the expectation value by adding together each prize times the probability of getting that prize. Here the possible outcomes are $1, $10, $7,500, $25,000, and $35,000 payoffs. The probability of getting any of them is exactly 1/5, the same for all of them, since the Contestant is no more likely to have picked the high-value than the low-value or even the median-value prizes.

So the expectation value is, and Chiaroscuro calculated this in slightly different form, just (1/5) × ($1 + $10 + $7,500 + $25,000 + $35,000) = $67,511 ÷ 5, or a total of $13,502.20.
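Spelled out as code (the prize amounts and the Banker’s offer are the ones given above), the calculation is nothing more than a plain average, since the probabilities are all equal:

```python
prizes = [1, 10, 7_500, 25_000, 35_000]
offer = 11_750

# Equal probabilities mean the expectation value is just the plain average.
expected = sum(prizes) / len(prizes)

print(expected)          # 13502.2
print(expected > offer)  # True: holding out beats the offer, on average
```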

The Contestant, presumably able to do this sort of arithmetic in her head very quickly, might reasonably therefore conclude that she can expect a better payoff by insisting on what’s in her suitcase rather than the Banker’s offer of a measly $11,750.

If I were on the spot, by the way, having to do this in my head, I would not try doing quite this calculation. I’d know, first, that since I’m multiplying each of the prize amounts by 1/5, I could save myself the trouble of dividing over and over by five by just adding the prize amounts together and dividing that total by five. Second, I wouldn’t bother adding the $1 and the $10. They’re near enough zero dollars. I might even round $7,500 up to $10,000 just to make the calculation easier to deal with. Then I’d reason: 35,000 plus 25,000 is … actually, that’s 10,000 plus 25,000 plus 25,000, an easy 60,000 to deal with. Add 10,000 to *that* and I have a total of 70,000, which divided by five is 14,000, so, my approximated expectation value is still higher than the Banker’s offer.

If I were more fussy, I’d have got 67,500 divided by five, which is a bit of a mess; or figured that 67,500 is 70,000 minus 2,500. So 67,500 divided by 5 is equal to 70,000 divided by 5 — which was 14,000 — minus 2,500 divided by 5 — which is 500 — so that my approximated expectation value is 13,500. I could go on to make my decision without being too far wrong on the expectation value.
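The two mental shortcuts above, written out as arithmetic (this is just the reasoning from the last two paragraphs, nothing new):

```python
# Rough estimate: drop the near-zero prizes ($1 and $10), round 7,500 up to 10,000.
rough = (10_000 + 25_000 + 35_000) / 5
print(rough)    # 14000.0

# Fussier estimate: keep the 7,500, treating 67,500 as 70,000 minus 2,500.
fussier = 70_000 / 5 - 2_500 / 5
print(fussier)  # 13500.0
```

Either way the estimate stays comfortably above the $11,750 offer, which is all the decision needs.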

However, average return may not be the right way to choose. If you need $10k for life-saving surgery, then taking the sure thing is a better option because the first ten thousand dollars are worth far more than the next ten thousand.
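One standard way to make this precise is a concave utility function, which makes each extra dollar worth less than the one before. Logarithmic utility is a common textbook choice (my choice of illustration here, not the commenter’s); under it the sure thing beats the gamble handily, because the tiny prizes drag the expected utility way down:

```python
import math

prizes = [1, 10, 7_500, 25_000, 35_000]
offer = 11_750

# Expected *utility* of gambling under log utility, each prize equally likely.
eu_gamble = sum(math.log(p) for p in prizes) / len(prizes)
eu_offer = math.log(offer)

# The $1 and $10 outcomes pull the expected log-utility far below the offer's.
print(eu_gamble < eu_offer)  # True
```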


That’s just the sort of thing I hoped students would point out in their answers to the “why”, although I don’t remember any of them pointing out the circumstance that the first ten thousand dollars can be of more utility than the second.


Well, we do have to make the simplifying assumption that “Maximizing potential profits is the goal.” If your goal is “Take the first offer after $10,000”, that’s a different problem.

That said, going “the sure thing’s gap from the expected value is close enough, I’ll just go for the sure thing” happens on the show regularly; and it’s part of the drive of the show.


Yeah, before saying what is definitely the right answer you have to figure out what your goal is (isn’t that always the case?). Assuming the goal is maximizing the expected payoff is reasonable enough, and it’s probably the first impulse given this problem’s setup, but it’s worth realizing that isn’t everything.

I’m confident from watching the show that the Banker’s offers are — at least after the first couple rounds, where there are too many suitcases left in play to deal with and the Contestant is asked to pick eight or so to eliminate — almost consistently just enough less than the expected value of the payout to be an annoyingly large discrepancy, but not so much less that it’s obviously foolish to take the sure thing. So the psychologically interesting part of the game is figuring out how much less than the expected value contestants are likely to accept in order not to risk getting the worst payouts.

I can’t help looking at this and thinking, I’ve got a 3/5 chance of getting less money than offered, and a 2/5 chance of getting more; I’d accept the offer. It’s all very well saying the expected value over many iterations is greater than the offer, but I’m only playing this once.
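The commenter’s count, spelled out (prize list and offer from the problem as stated):

```python
prizes = [1, 10, 7_500, 25_000, 35_000]
offer = 11_750

# Three of the five prizes pay less than the offer; two pay more.
worse = sum(1 for p in prizes if p < offer)
better = sum(1 for p in prizes if p > offer)

print(worse, better)        # 3 2
print(worse / len(prizes))  # 0.6: chance of doing worse than the offer
```

The expectation value is higher only because the two good outcomes are so much better than the offer, not because they’re likely.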


Full marks! That’s just the thing I was hoping people taking the original test question would notice and consider. (A fair number did; the “why” tipped them off, I think, that the question was more than just identifying the right formula and calculating from it.)

Certainly someone playing repeatedly is better off rejecting the sure thing, but a person who only plays the once does have to consider that the $1 payoff is as likely as the $35,000 one.
