## Reading the Comics, April 2, 2016: Keeping Me Busy Edition

After I made a little busy work for myself posting a Reading the Comics entry the other day, Comic Strip Master Command sent a rush of mathematics themes into the comics. So it goes.

Chris Browne’s Hagar the Horrible for the 31st of March happens to be funny-because-it’s-true. It’s supposed to be transgressive to see a gambler as the best mathematician available. But quite a few of the great pioneering minds of mathematics were also gamblers looking for an edge. It may shock you to learn that mathematicians in past centuries didn’t have enough money, and would look for ways to get more. And, as ever, knowing something secret about the way cards or dice or any unpredictable event might happen gives one an edge. The question of whether a 9 or a 10 is more likely to be thrown on three dice was debated for centuries, by people as familiar to us as Galileo. And by people as familiar to mathematicians as Gerolamo Cardano.

Gambling blends imperceptibly into everything people want to do. The question of how to fairly divide the pot of an interrupted game may seem sordid. But recast it as the problem of how to divide the assets of a partnership which had to halt — say, because one of the partners had to stop participating — and we have something that looks respectable. And gambling blends imperceptibly into security. The result of any one project may be unpredictable. The result of many similar ones, on average, often is. Card games or joint-stock insurance companies; the mathematics is the same. A good card-counter might be the best mathematician available.

Tony Cochran’s Agnes for the 31st name-drops Diophantine equations. It’s in the service of a joke about a student resisting class. Diophantine equations are equations for which we only allow integer, whole-number, answers. The name refers to Diophantus of Alexandria, who lived in the third century AD. His Arithmetica describes many methods for solving equations, a prototype to algebra as we know it in high school today. Generally, a Diophantine equation is a hard problem. It’s impossible, for example, to say whether an arbitrary Diophantine equation even has a solution. Finding what it might be is another bit of work. Fermat’s Last Theorem is about a Diophantine equation, and it took centuries to work out that, past the second power, it has no answer.

Mind, we can say for specific cases whether a Diophantine equation has a solution. And those specific cases can be pretty general. If we know integers a and b that share no common factor bigger than 1, then we can find integers x and y that make “ax + by = 1” true, for example.
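Finding such an x and y is one of the oldest algorithms around, the extended Euclidean algorithm. Here is a minimal sketch in Python; the function name `bezout` is my own choice, after the identity this computes.

```python
def bezout(a, b):
    """Extended Euclidean algorithm.

    Returns (g, x, y) with a*x + b*y == g, where g is the greatest
    common divisor of a and b. When a and b share no common factor
    bigger than 1, g is 1, and we have our Diophantine solution.
    """
    old_r, r = a, b
    old_x, x = 1, 0
    old_y, y = 0, 1
    while r != 0:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_x, x = x, old_x - q * x
        old_y, y = y, old_y - q * y
    return old_r, old_x, old_y

g, x, y = bezout(15, 4)
print(g, x, y)          # g is 1, and 15*x + 4*y equals 1
```

The loop keeps the invariant that `a*old_x + b*old_y == old_r` at every step, which is why the final remainders hand us the coefficients for free.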

Graham Harrop’s Ten Cats for the 31st hurts mathematicians’ feelings on the way to trying to help a shy cat. I’m amused anyway.

And Jonathan Lemon’s Rabbits Against Magic for the 1st of April mentions Fermat’s Last Theorem. The structure of the joke is fine. If we must ask an irrelevant question of the Information Desk, mathematics has got plenty of good questions. The choice makes me suspect Lemon’s showing his age, though. The imagination-capturing power of Fermat’s Last Theorem as a great unknown has to have been diminished since the first proof was found over two decades ago. It’d be someone who grew up knowing there was this mystery about x^n plus y^n equalling z^n who’d jump to this reference.

Tom Toles’s Randolph Itch, 2 am for the 2nd of April mentions “zero-sum games”. The term comes from the mathematical theory of games. The field might sound frivolous, but that’s because you don’t know how much stuff the field considers to be “games”. Mathematicians who study them consider “games” to be sets of decisions. One or more people make choices, and gain or lose as a result of those choices. That is a pretty vague description. It covers playing solitaire and multiplayer Civilization V. It also covers career planning and imperial brinksmanship. And, for that matter, business dealings.

“Zero-sum” games refer to how we score the game’s objectives. If it’s zero-sum, then anything gained by one player must be balanced by equal losses by the other player or players. For example, in a sports league’s season standings, one team’s win must balance another team’s loss. The total number of won games, across all the teams, has to equal the total number of lost games. But a game doesn’t have to be zero-sum. It’s possible to create games in which all participants gain something, or all lose something. Or where the total gained doesn’t equal the total lost. These are, imaginatively, called non-zero-sum games. They turn up often in real-world applications. Political or military strategy often is about problems in which both parties can lose. Business opportunities are often intended to see the directly involved parties benefit. This is surely why Randolph is shown reading the business pages.
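The defining property is easy to state in code. Here is a small sketch, using rock-paper-scissors as the standard example of a zero-sum game; the table and the helper function are my own illustration, not anything from the strip.

```python
# Rock-paper-scissors payoffs for (row player, column player).
# In a zero-sum game every outcome's payoffs cancel exactly:
# whatever one player gains, the other loses.
PAYOFFS = {
    ('rock', 'rock'): (0, 0),       ('rock', 'paper'): (-1, 1),
    ('rock', 'scissors'): (1, -1),  ('paper', 'rock'): (1, -1),
    ('paper', 'paper'): (0, 0),     ('paper', 'scissors'): (-1, 1),
    ('scissors', 'rock'): (-1, 1),  ('scissors', 'paper'): (1, -1),
    ('scissors', 'scissors'): (0, 0),
}

def is_zero_sum(payoffs):
    """True when every outcome's payoffs sum to zero."""
    return all(sum(outcome) == 0 for outcome in payoffs.values())

print(is_zero_sum(PAYOFFS))  # True
```

A non-zero-sum game just needs one outcome where the payoffs don’t cancel, say, a deal where both parties come out ahead.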

## Some Cards Stuff

I’m not good at shuffling cards. I can eventually scramble up a deck tolerably well, given time, but I can’t do a good riffle shuffle. Nor can I do any of the moves that show competence at card-shuffling. That thing where people make a little semicircular arch of cards in their hands? I can’t even understand how that works, never mind do it myself.

That said, I do know how many shuffles it takes to randomize a deck of cards. I mean a standard deck of 52. It’s seven. I learned that ages ago, but never saw it proved. Best I could work out was that each shuffling would, if done perfectly, mix the upper and lower halves of the old deck. So I want what had been (say) the first and second card to be mixed up arbitrarily far from one another. One shuffle might get one or two cards between the old first and second. Two shuffles might double that; three shuffles that again, and so on. By six shuffles there could be anywhere up to 64 cards between the first and second, and that’s … surely enough, right? Then one more for good luck? It’s not rigorous but you can see where that satisfies.
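My hand-waving can at least be watched in action. Here is a rough simulation in Python, using the Gilbert–Shannon–Reeds model of an imperfect riffle shuffle, which is the model the Bayer–Diaconis paper studies; tracking the gap between the old first and second cards is my own crude measure, not theirs.

```python
import random

def gsr_riffle(deck):
    """One riffle shuffle under the Gilbert-Shannon-Reeds model.

    Cut the deck at a Binomial(n, 1/2) point, then interleave by
    dropping cards from either half with chance proportional to how
    many cards that half still holds.
    """
    n = len(deck)
    cut = sum(random.random() < 0.5 for _ in range(n))
    left, right = deck[:cut], deck[cut:]
    out = []
    while left or right:
        if random.random() * (len(left) + len(right)) < len(left):
            out.append(left.pop(0))
        else:
            out.append(right.pop(0))
    return out

# Watch how far apart the old first and second cards drift.
deck = list(range(52))
for k in range(1, 8):
    deck = gsr_riffle(deck)
    print(k, abs(deck.index(0) - deck.index(1)))
```

Run it a few times and the once-adjacent cards wander anywhere in the deck by the later shuffles, which is the doubling intuition made visible, if not rigorous.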

A Probability Fact of the Day tweet gives a real explanation. It links to the paper Trailing the Dovetail Shuffle to its Lair, by Dr Dave Bayer and Dr Persi Diaconis, which first appeared in the Annals of Applied Probability, 1992, Vol. 2, No. 2, pp. 294–313. It gives an actual proof of why seven shuffles are what’s needed.

I’m sad to admit the paper isn’t one you can jump into without mathematical training. Even the symbols may seem bizarre: it uses π not for that thing to do with circles. Instead it’s used as a variable name, the way we might use ‘x’ in ordinary algebra. In this context it stands for ‘permutation’. That’s a standard thing to do in this field of mathematics. It just looks funny if you’re coming in cold.

A permutation, here, means how you change the order of things. For example, suppose we start out with five things, which I’ll label with the letters of the alphabet. Suppose they start out in the order A B C D E. (We could use other labels, but letters are convenient.) I can apply a permutation π to this ordered list of letters. Suppose that afterwards they end up in the order C A B E D. Then the permutation π did this: it moved the first thing to the second spot. It moved the second thing to the third spot. It moved the third thing to the first spot. It moved the fourth thing to the fifth spot. It moved the fifth thing to the fourth spot. There are several ways to describe this efficiently. I could say, for example, that the permutation π is (2 3 1 5 4). (If you don’t see why that works, think about it a while.) There’s other ways to write this. We don’t need them just now.

You can chain permutations together. If we did the same swapping of order on C A B E D, we would get the list B C A D E. That’s the same list we would have gotten if we had started with A B C D E and done a different permutation once. It’s what we would get if we had done (3 1 2 4 5). We can think of this as what you get if we “multiply” π by π. Permutations, these directions of how to shuffle a list of things, can work a lot like numbers.
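The bookkeeping above can be spelled out in a few lines of Python. This is a sketch using the same one-line notation as the text, where the i-th entry says which spot the i-th thing moves to; the function names are my own.

```python
def apply_perm(perm, items):
    """Move the thing in spot i to spot perm[i] (spots numbered from 1)."""
    out = [None] * len(items)
    for i, target in enumerate(perm):
        out[target - 1] = items[i]
    return out

def compose(p, q):
    """The single permutation equal to doing p first and then q."""
    return [q[p[i] - 1] for i in range(len(p))]

pi = [2, 3, 1, 5, 4]
print(apply_perm(pi, list('ABCDE')))                   # ['C', 'A', 'B', 'E', 'D']
print(apply_perm(pi, apply_perm(pi, list('ABCDE'))))   # ['B', 'C', 'A', 'D', 'E']
print(compose(pi, pi))                                 # [3, 1, 2, 4, 5]
```

Note that `compose(pi, pi)` applied once gives the same list as `pi` applied twice, which is the “multiplying π by π” idea from the text.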

There’s more interesting things in here, even if you don’t follow the argument. I admit I get lost somewhere in section 3. I’m hoping someone asks me about the baker’s transformation. But it does describe some impressive-sounding magic tricks to be done with a slightly shuffled deck. And it gives this great puzzle, as well as answering it.

Suppose someone has a well-shuffled deck of cards. She deals them one at a time. You try to guess what card is coming up next. And you never make the foolish mistake of predicting a card that’s already come up. Most of the time you’ll be wrong. But at least you’ll end on a success. After 51 cards have been dealt you will call the final one right.

How many cards would you expect, on average, to call correctly, out of these 52?
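If you’d rather experiment than reason it out, a quick Monte Carlo sketch estimates the average. This is my own simulation, not anything from the paper, and it assumes the simplest sensible strategy: guess uniformly at random among the cards not yet seen.

```python
import random

def play_once():
    """Deal a shuffled 52-card deck; guess uniformly among unseen cards."""
    deck = list(range(52))
    random.shuffle(deck)
    remaining = list(range(52))
    correct = 0
    for card in deck:
        guess = random.choice(remaining)
        if guess == card:
            correct += 1
        remaining.remove(card)
    return correct

trials = 20000
estimate = sum(play_once() for _ in range(trials)) / trials
print(round(estimate, 2))  # always at least 1, since the last card is a sure thing
```

I won’t spoil the exact expected value here; running this gets you close enough to check your answer against.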

## Reading the Comics, March 1, 2014: Isn’t It One-Half X Squared Plus C? Edition

So the subject line here references a mathematics joke that I have never heard anybody actually tell, and only encounter in lists of mathematics jokes. It goes like this: a couple of professors are arguing at lunch about whether normal people actually learn anything about calculus. One of them says he’s so sure normal people learn calculus that even their waiter would be able to answer a basic calc question, and they make a bet on that. He goes back and finds their waiter and says, when she comes with the check he’s going to ask her if she knows what the integral of x is, and she should just say, “why, it’s one-half x squared, of course”. She agrees. He goes back and asks her what the integral of x is, and she says of course it’s one-half x squared, and he wins the bet. As he’s paid off, she says, “But excuse me, professor, isn’t it one-half x squared plus C?”

Let me explain why this is an accurately structured joke construct and must therefore be classified as funny. “The integral of x”, as the question puts it, has not just one correct answer but rather a whole collection of correct answers, which are different from one another only by the addition of a constant, by convention denoted C, and the inclusion of that “plus C” denotes that whole collection. The professor was being sloppy in referring to just a single example from that collection instead of the whole set, as the waiter knew to do. You’ll see why this is relevant to today’s collection of mathematics-themed comics.

Jef Mallet’s Frazz (February 22) points out one of the grand things about mathematics, that if you follow the proper steps in a mathematical problem you get to be right, and to be extraordinarily confident in that rightness. And that’s true, although, at least to me a good part of what’s fun in mathematics is working out what the proper steps are: figuring out what the important parts of something you want to study should be, and what follows from your representation of them, and — particularly if you’re trying to represent a complicated real-world phenomenon with a model — whether you’re representing the things you find interesting in the real-world phenomenon well. So, while following the proper steps gets you an answer that is correct within the limits of whatever it is you’re doing, you still get to work out whether you’re working on the right problem, which is the real fun.

Mark Pett’s Lucky Cow (February 23, rerun) uses that ambiguous place between mathematics and physics to represent extreme smartness. The equation the physicist brings to Neil is the (time-dependent) Schrödinger Equation, describing how probability evolves in time, and the answer is correct. If Neil’s coworkers at Lucky Cow were smarter they’d realize the scam, though: while the equation is impressively scary-looking to people not in the know, a particle physicist would have about as much chance of forgetting this as of forgetting the end of “E equals m c … ”.

Hilary Price’s Rhymes With Orange (February 24) builds on the familiar infinite-monkeys metaphor, but misses an important point. Price is right that yes, an infinite number of monkeys already did create the works of Shakespeare, as a result of evolving into a species that could have a Shakespeare. But the infinite monkeys problem is about selecting letters at random, uniformly: the letter following “th” is as likely to be “q” as it is to be “e”. An evolutionary system, however, encourages the more successful combinations in each generation, and discourages the less successful: after writing “th” Shakespeare would be far more likely to put “e” and never “q”, which makes calculating the probability rather less obvious. And Shakespeare was writing with awareness that the words mean things and they must be strings of words which make reasonable sense in context, which the monkeys on typewriters would not. Shakespeare could have followed the line “to be or not to be” with many things, but one of the possibilities would never be “carport licking hammer worbnoggle mrxl 2038 donkey donkey donkey donkey donkey donkey donkey”. The typewriter monkeys are not so selective.
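To get a sense of the scale involved, the uniform-letters arithmetic is short. Here is a sketch; the 27-key typewriter, 26 letters plus a space bar, is my own simplifying assumption.

```python
# Chance a uniform random typist produces a given phrase in one
# aligned attempt: every keystroke is equally likely, so the phrase's
# probability shrinks by a factor of 27 with each character.
phrase = "to be or not to be"
keys = 27
chance = (1 / keys) ** len(phrase)
print(chance)  # vanishingly small, on the order of 10 to the -26
```

A weighted, evolutionary scheme that keeps successful prefixes would reach the phrase astronomically faster, which is exactly the distinction Price’s strip glosses over.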

Dan Thompson’s Brevity (February 26) is a cute joke about a number’s fashion sense.

Mark Pett’s Lucky Cow turns up again (February 28, rerun) for the Rubik’s Cube. The tolerably fun puzzle and astoundingly bad Saturday morning cartoon of the 80s can be used to introduce abstract algebra. When you rotate the nine little cubes on one face of a Rubik’s Cube, you’re doing something which is kind of like addition. Think of what you can do with the top row of cubes: you can leave it alone, unchanged; you can rotate it one quarter-turn clockwise; you can rotate it one quarter-turn counterclockwise; you can rotate it two quarter-turns clockwise; you can rotate it two quarter-turns counterclockwise (which will result in something suspiciously similar to the two quarter-turns clockwise); you can rotate it three quarter-turns clockwise; you can rotate it three quarter-turns counterclockwise.

If you rotate the top row one quarter-turn clockwise, and then another one quarter-turn clockwise, you’ve done something equivalent to two quarter-turns clockwise. If you rotate the top row two quarter-turns clockwise, and then one quarter-turn counterclockwise, you’ve done the same as if you’d just turned it one quarter-turn clockwise and walked away. You’re doing something that looks a lot like addition, without being exactly like it. Something odd happens when you get to four quarter-turns either clockwise or counterclockwise, particularly, but it all follows clear rules that become pretty familiar when you notice how much it’s like saying four hours after 10:00 will be 2:00.
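The clock-arithmetic flavor of this is compact enough to write down. A minimal sketch, with the counting convention (clockwise positive) and the function name being my own choices:

```python
def net_turns(*quarter_turns):
    """Net effect of a sequence of quarter-turns on one face.

    Clockwise is +1, counterclockwise is -1; four quarter-turns in
    either direction bring the face back to where it started, so
    everything reduces modulo 4.
    """
    return sum(quarter_turns) % 4

print(net_turns(1, 1))        # 2: two clockwise quarter-turns
print(net_turns(2, -1))       # 1: same as a single clockwise quarter-turn
print(net_turns(1, 1, 1, 1))  # 0: back to the start, like 4 hours past 10:00 being 2:00
```

This is addition modulo 4, the cyclic group on four elements, which is about the smallest piece of abstract algebra a Rubik’s Cube face can give you.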

Abstract algebra marks one of the things you have to learn as a mathematics major that really changes the way you start looking at mathematics, as it really stops being about trying to solve equations of any kind. You instead start looking at how structures are put together — rotations are seen a lot, probably because they’re familiar enough you still have some physical intuition, while still having significant new aspects — and, following this trail can get for example to the parts of particle physics where you predict some exotic new subatomic particle has to exist because there’s this structure that makes sense if it does.

Jenny Campbell’s Flo and Friends (March 1) is set off with the sort of abstract question that comes to mind when you aren’t thinking about mathematics: how many five-card combinations are there in a deck of (52) cards? Ruthie offers an answer, although — as the commenters get to disputing — whether she’s right depends on what exactly you mean by a “five-card combination”. Would you say that a hand of “2 of hearts, 3 of hearts, 4 of clubs, Jack of diamonds, Queen of diamonds” is a different one to “3 of hearts, Jack of diamonds, 4 of clubs, Queen of diamonds, 2 of hearts”? If you’re playing a game in which the order of the deal doesn’t matter, you probably wouldn’t; but, what if the order does matter? (I admit I don’t offhand know a card game where you’d get five cards and the order would be important, but I don’t know many card games.)
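Both readings of “five-card combination” are easy to count, if you’ll take Python’s standard library on faith. A sketch, using `math.comb` and `math.perm`:

```python
import math

# Order of the deal ignored: how many distinct 5-card hands?
unordered = math.comb(52, 5)
# Order of the deal matters: how many distinct 5-card sequences?
ordered = math.perm(52, 5)

print(unordered)  # 2598960
print(ordered)    # 311875200
```

The two counts differ by a factor of 5! = 120, the number of orders any one five-card hand can be dealt in, which is the whole dispute in one line.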

For that matter, if you accept those two hands as the same, would you accept “2 of clubs, 3 of clubs, 4 of diamonds, Jack of spades, Queen of spades” as a different hand? The suits are different, yes, but they’re not differently structured: you’re still three cards away from a flush, and two away from a straight. Granted there are some games in which one suit is worth more than another, in which case it matters whether you had two diamonds or two spades; but if you got the two-of-clubs hand just after getting the two-of-hearts hand you’d probably be struck by how weird it was you got the same hand twice in a row. You can’t give a correct answer to the question until you’ve thought about exactly what you mean when you say two hands of cards are different.

## Disputed Cards

In one of my classes we’ve plunged into probability, which is a fun subject because suddenly there are no complicated calculations to do — at worst, there are long ones — but you have to be extremely careful about what calculations you do, so all that apparent simplicity gets turned into conceptual difficulty. So it’s brought to mind something a student in an earlier term told me about.

I’d given as a problem one of the standard rote probability puzzles: the chance of picking three red cards in a row from a well-shuffled and full deck. The chance of doing this depends a bit on whether you put the just-picked card back in the deck and reshuffle or not, but in either case, it’s pretty close to one in eight. Multiple students got this exactly right, glad to say, but one spun it out into an anecdote.
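The two versions of the calculation fit in a couple of lines. A sketch, using exact fractions so nothing gets lost to rounding:

```python
from fractions import Fraction

# Put each card back and reshuffle: the draws are independent.
with_replacement = Fraction(26, 52) ** 3
# Keep the drawn cards out: each red draw leaves fewer reds behind.
without_replacement = Fraction(26, 52) * Fraction(25, 51) * Fraction(24, 50)

print(with_replacement)     # 1/8
print(without_replacement)  # 2/17, a little under 0.118
```

Both land near one in eight, with the no-replacement version a touch smaller, since each red card drawn makes the next red slightly less likely.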

The student was pretty enthusiastic about the course topics and while hanging out with a sibling mentioned this as a problem, and the solution. The sibling, however, didn’t believe it, and insisted that since there are an equal number of red and black cards there should be a one in two chance of drawing three red cards in a row. The two disputed the subject for the whole weekend, and my student apparently rather appreciated having something novel to argue about.

I’m always delighted when a student is interested enough in a problem to mention it to anyone else, and probability puzzles often give things with real-world models simple enough to catch the imagination. But I was (and still am) surprised the question could last a whole weekend. Freaky things can happen with small sample sizes, but I’d be willing to bet that trying it out with a deck of cards a couple times would at least provide convincing evidence that the chance of three reds in a row wasn’t one in two.
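If the siblings had no physical deck handy, a simulated one settles it just as fast. A quick sketch of the experiment I have in mind:

```python
import random

def three_reds():
    """Shuffle a full deck; check whether the first three cards are red."""
    deck = ['red'] * 26 + ['black'] * 26
    random.shuffle(deck)
    return deck[0] == deck[1] == deck[2] == 'red'

trials = 20000
estimate = sum(three_reds() for _ in range(trials)) / trials
print(round(estimate, 3))  # hovers near 0.118, nowhere near one in two
```

Even a few dozen hand-shuffled trials would separate “about one in eight” from “one in two” convincingly, which is the bet I’d have offered the siblings.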

Possibly my student wasn’t communicating the problem well; one in two would be about right for the chance of picking a third red card, after all, regardless of what the first two were. Or perhaps they didn’t have a deck of cards to try it out. I couldn’t reliably lay my hand on a deck of cards until a few weeks ago, when I bought a deck so I could demonstrate problems in class.

Also, a standard-size deck of cards is far too small for a class demonstration. I need to find a magic store and get an oversized deck of cards. I have the same problem with the dice I picked up, but there I should be able to find a giant pair of dice in an auto parts store. They’ll be fuzzy, but should express the idea of dice well enough for that.

## Illicitly Counted Coins

The past month I’ve had the joy of teaching a real, proper class again, after a hiatus of a few years. The hiatus has given me the chance to notice some things that I would do because that was the way I had done them, and made it easier to spot things that I could do differently.

To get a collection of data about which we could calculate statistics, I had everyone in the class flip a coin twenty times. Besides giving everyone something to do other than figuring out which of my strange mutterings should be written down in case they turn out to be on the test, the result would give me a bunch of numbers, centered around ten, once they reported the number of heads which turned up. Counting the number of heads out of a set of coin flips is one of the traditional exercises to generate probability-and-statistics numbers.
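The exercise is easy to rehearse without a classroom. A sketch of what one class period’s worth of data looks like; the class size of thirty is my own assumption for illustration.

```python
import random

def heads_in_twenty():
    """One student's worth of data: heads out of twenty fair flips."""
    return sum(random.random() < 0.5 for _ in range(20))

class_counts = [heads_in_twenty() for _ in range(30)]
print(class_counts)  # clustered around ten, with a fair bit of spread
```

The counts bunch around ten but rarely hit it exactly, which is itself a useful first lesson about what “expected value” does and doesn’t promise.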

Good examples are some of the most precious and needed things for teaching mathematics. It’s never enough to learn a formula; one needs to learn how to look at a problem, think of what one wants to know as a result of its posing, identify what one needs to get those results, and pick out which bits of information in the problem and which formulas allow the result to be found. It’s all the better if an example resembles something normal people would find to raise a plausible question. Here, we may not be all that interested in how many times a coin comes up heads or tails, but we can imagine being interested in how often something happens given a number of chances for it to happen, and how much that count of happenings can vary if we watch several different runs.