Reading the Comics, July 2, 2019: Back On Schedule Edition


I hoped I’d get a Reading the Comics post in for Tuesday, and even managed it. With this I’m all caught up to the syndicated comic strips which, last week, brought up some mathematics topics. I’m open for nominations about what to publish here Thursday. Write in quickly.

Hilary Price’s Rhymes With Orange for the 30th is a struggling-student joke. And set in summer school, so the comic can be run the last day of June without standing out to its United States audience. It expresses a common anxiety, about that point when mathematics starts using letters. It superficially seems strange that this change worries students. Students surely had encountered problems where some term in an equation was replaced with a blank space and they were expected to find the missing term. This is the same work as using a letter. Still, there are important differences. First is that a blank line (box, circle, whatever) has connotations of “a thing to be filled in”. A letter seems to carry meaning into the problem, even if it’s just “x marks the spot”. And a letter, as we use it in English, always stands for the same thing (or at least the same set of things). That ‘x’ may be 7 in one problem and 12 in another seems weird. I mean weird even by the standards of English orthography.

Summer School. Student, as the instructor writes a^2 + b^2 != c^2 on the board: 'Math isn't fair. It's numbers, numbers, numbers, then bam! It's letters.'
Hilary Price’s Rhymes With Orange for the 30th of June, 2019. Essays with some mention of Rhymes With Orange should be at this link.

A letter might represent a number whose value we wish to know; it might represent a number whose value we don’t care about. These are different ideas. We usually fall into a convention where numbers we wish to know are more likely x, y, and z, while those we don’t care about are more likely a, b, and c. But even that’s no reliable rule. And there may be several letters in a single equation. It’s one thing to have a single unknown number to deal with. To have two? Three? I don’t blame people fearing they can’t handle that.

Mark Leiknes’s Cow and Boy for the 30th has Billy and Cow pondering the Prisoner’s Dilemma. This is one of the first examples someone encounters in game theory. Game theory sounds like the most fun part of mathematics. It’s the study of situations in which there are multiple parties following formal rules which allow for gains or losses. This is an abstract description. It means many things fit a mathematician’s idea of a game.

Billy: 'If we're ever arrested for the same crime we should never rat each other out. If we don't rat, then maybe we both go free. If we both rat, we both go to jail. If one rats, then the other goes to jail. But since we can't trust the interro --- ' Cow: 'BUT BOOGER GNOME STOLE THAT STEREO EQUIPMENT FOR HIS PIZZA BOX HOUSE!' Billy: 'YOU THINK THE COPS ARE GONNA BUY THAT?' Booger Gnome, with the stolen equipment: 'THERE'S NO @$#&* OUTLETS?!'
Mark Leiknes’s Cow and Boy rerun for the 30th of June, 2019. The comic strip is long since ended, but hasn’t quite rerun enough times for me to get tired of it. So essays featuring Cow and Boy appear at this link. The gnome is a lawn gnome who came to life and … you know, this was a pretty weird comic and I understand why it didn’t make it in the newspapers. Just roll with it.

The Prisoner’s Dilemma is described well enough by Billy. It’s built on two parties, each — separately and without the ability to coordinate — having to make a choice. Both would be better off, under interrogation, to keep quiet and trust that the cops can’t get anything significant on them. But both have the temptation that if they rat out the other, they’ll get off free while their former partner gets screwed. And knowing that their partner has the same temptation. So what would be best for the two of them requires them both doing the thing that maximizes their individual risk. The implication is unsettling: everyone acting in their own best interest is supposed to produce the best possible result for society. And here, for the society of these two accused, it breaks down entirely.
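If you want to poke at the numbers yourself, here’s a minimal Python sketch of the dilemma’s payoffs. The particular sentence lengths are my own placeholder numbers, not anything from the strip; the point is only that ratting is the better individual choice whatever the partner does, even though mutual silence is the better joint outcome.

```python
# A minimal sketch of the Prisoner's Dilemma payoffs. The sentence lengths
# (in years) are a common textbook-style choice, not anything canonical.
# Each entry maps (my_choice, partner_choice) to (my_years, partner_years).
payoffs = {
    ("quiet", "quiet"): (1, 1),   # both stay quiet: minor charge only
    ("quiet", "rat"):   (10, 0),  # I stay quiet, partner rats: I'm sunk
    ("rat",   "quiet"): (0, 10),  # I rat, partner stays quiet: I go free
    ("rat",   "rat"):   (8, 8),   # both rat: both convicted
}

# Whatever the partner does, ratting gives me a shorter sentence ...
for partner in ("quiet", "rat"):
    mine_if_quiet = payoffs[("quiet", partner)][0]
    mine_if_rat = payoffs[("rat", partner)][0]
    print(f"partner {partner}: quiet -> {mine_if_quiet} years, rat -> {mine_if_rat} years")

# ... so both players rat, and both serve 8 years instead of 1.
```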

Jason Poland’s Robbie and Bobby for the 1st is a rerun. I discussed it last time it appeared, in November 2016, which was before I would routinely include the strips under discussion. The strip’s built on wordplay, using the word ‘power’ in its connotations for might and for exponents.

Robbie: 'My opinion letter is really going to make a difference!' Bobby: 'More power to you, Robbie!' Robbie: 'You've been saying that a lot lately ... know what? I *do* feel more powerful! ... Ooh, an exponent!' (A '10' appears over Robbie's typewriter. Bobby grabs it.) Robbie: 'Hey! I earned that!' Bobby: 'You have no clue what I'll do with this power!' Next panel: Bobby's sleeping, with his sleep sound being 'zzzz^{10}'.
Jason Poland’s Robbie and Bobby rerun for the 1st of July, 2019. I think but am not sure that this comic strip has lapsed into eternal reruns. In any case the essays that mention some topic raised by Robbie and Bobby are at this link.

Exponents have been written as numbers in superscript following a base for a long while now. The notation developed over the 17th century. I don’t know why mathematicians settled on superscripts, as opposed to the many other ways a base and an exponent might fit together. It’s a good mnemonic to remember, say, “z raised to the 10th” is z with a raised 10. But I don’t know the etymology of “raised” in a mathematical context well enough. It’s plausible that we say “raised” because that’s what the notation suggests.

Zach Weinersmith’s Saturday Morning Breakfast Cereal for the 2nd argues for the beauty of mathematics as a use for it. It’s presented in a brutal manner, but saying brutal things to kids is a comic motif with history to it. Well, in an existentialist manner, but that gets pretty brutal quickly.

Kids: 'Will we ever use math?' Teacher: 'Of course! Life is an express train headed for oblivion city, and this proof of Pythagoras' theorem is one more pretty thing to contemplate before you pull into the station.' (The diagram is of a large square, with each leg divided into segments of length a and b; inside is a smaller square, connecting the segments within each of the outer square's edges, with the sides of this inner square length c.) Kid: 'I mean, like, will it get me a job?' Teacher: 'It got me this job conducting your express train!'
Zach Weinersmith’s Saturday Morning Breakfast Cereal for the 2nd of July, 2019. This one doesn’t appear in every Reading the Comics essay, so you can find my discussions inspired by Saturday Morning Breakfast Cereal at this link.

The proof of the Pythagorean Theorem is one of the very many known to humanity. This one is among the family of proofs that are wordless. At least nearly wordless. You can get from here to a^2 + b^2 = c^2 with very little prompting. If you do need prompting, it’s this: there are two expressions for the area of the square with sides a-plus-b. One of these expressions uses only terms of a and b. The other expression uses terms of a, b, and c. If this doesn’t get a bit of a grin out of you, don’t worry. There’s, like, 2,037 other proofs we already know about. We might ask whether we need quite so many proofs of the Pythagorean theorem. It doesn’t seem to be under serious question most of the time.
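If you’d like the prompting written out as algebra, here’s a sketch, using the strip’s labels a, b, and c as I read the diagram:

```latex
% Two ways to count the area of the big square with side a + b:
% as one square, and as the tilted square of side c plus four
% right triangles with legs a and b.
(a + b)^2 = c^2 + 4 \cdot \tfrac{1}{2} a b
\;\Longrightarrow\;
a^2 + 2ab + b^2 = c^2 + 2ab
\;\Longrightarrow\;
a^2 + b^2 = c^2
```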


And then a couple comic strips last week just mentioned mathematics. Morrie Turner’s Wee Pals for the 1st of July has the kids trying to understand their mathematics homework. Could have been anything. Mike Thompson’s Grand Avenue for the 5th started a sequence with the kids at Math Camp. The comic is trying quite hard to get me riled up. So far it’s been the kids agreeing that mathematics is the worst, and has left things at that. Hrmph.


Whether or not I have something for Thursday, by Sunday I should have another Reading the Comics post. It, as well as my back catalogue of these essays, should be at this link. Thanks for worrying about me.


Yes, I Am Late With The Comics Posts Today


I apologize that, even though the past week was light on mathematically-themed comic strips, I didn’t have them written up by my usual Sunday posting time. It was just too busy a week, and I am still decompressing from the A to Z sequence. I’ll have them as soon as I’m able.

In the meanwhile may I share a couple of things I thought worth reading, and that have been waiting in my notes folder for the chance to highlight?

This Fermat’s Library tweet is one of those entertaining consequences of probability, multiplied by the large number of people in the world. If you flip twenty coins in a row there’s a one in 1,048,576 chance that all twenty will come up heads, and the same chance that all twenty will come up tails. So about once in every half-million times someone flips twenty coins, they all come up the same way. If the seven billion people in the world have each flipped at least twenty coins in their lives, then something like seven thousand of them had the coins turn up heads every single one of those twenty times. That all seven billion people have tossed a coin seems like the biggest point to attack this trivia on. A lot of people are too young for, or don’t have access to, coins. But there are still going to be thousands who did start their coin-flipping lives with a remarkable streak.
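If you want to check the arithmetic, a couple of lines of Python will do it; the seven-billion figure is the same rough world-population estimate as above.

```python
# Back-of-the-envelope check of the coin-flip trivia.
p_all_heads = 0.5 ** 20          # chance one person's twenty flips are all heads
people = 7_000_000_000           # rough world population

print(f"1 in {1 / p_all_heads:,.0f} chance of twenty heads in a row")
print(f"expected number of all-heads people: {people * p_all_heads:,.0f}")
# Roughly 6,700 people, close enough to the "seven thousand" above.
```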

Also back in October, so you see how long things have been circulating around here, John D Cook published an article about the World Series. Or any series contest. At least ones where the chance of each side winning doesn’t depend on the previous games in the series. If one side has a probability ‘p’ of winning any particular game, what’s the chance they’ll win a best-four-of-seven? What makes this a more challenging mathematics problem is that a best-of-seven series stops after one side’s won four games. So you can’t simply say it’s the chance of four wins. You need to account for four wins out of five games, out of six games, and out of seven games. Fortunately there’s a lot of old mathematics that explores just this.
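Cook’s article works out the details; as a rough cross-check, here’s a hedged Python sketch. It leans on the observation that, with independent games, winning a best-of-seven series is the same event as winning at least four of seven games played out in full, even though a real series stops early. The 55 percent figure in the example is just an arbitrary choice of mine.

```python
from math import comb

def series_win_prob(p, wins_needed=4, games=7):
    """Chance of winning a best-of-`games` series when each game is won
    independently with probability p. Winning the series is the same event
    as winning at least `wins_needed` of `games` games played out in full,
    even though a real series stops once someone reaches `wins_needed`."""
    return sum(comb(games, k) * p**k * (1 - p)**(games - k)
               for k in range(wins_needed, games + 1))

print(series_win_prob(0.55))  # about 0.608: a 55% team wins the series roughly 61% of the time
```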

The economist Bradford DeLong noticed the first write-up of the Prisoner’s Dilemma. This is one of the first bits of game theory that anyone learns, and it’s an important bit. It establishes that the logic of cooperative games — any project where people have to work together — can have a terrible outcome. What makes the most sense for the individuals makes the least sense for the group. A good outcome for everyone depends on trust, whether established through history or through constraints everyone’s agreed to respect.

And finally here’s part of a series about quick little divisibility tests. This is that trick where you tell what a number’s divisible by through adding or subtracting its (base ten) digits. Everyone who’d be reading this post knows about testing for divisibility by three or nine. Here are rules for also testing divisibility by eleven (which you might know), by seven (less likely), and thirteen. With a bit of practice, and awareness of some exceptional numbers, you can tell by sight whether a number smaller than a thousand is prime. Add a bit of flourish to your doing this and you can establish a reputation as a magical mathematician.
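The linked series explains its own rules; as a hedged sketch of the flavor, here is one family of tests I know of, coded up. These may not be the exact rules the series uses. The three-digit-group trick covers 7, 11, and 13 at once because 1001 = 7 × 11 × 13.

```python
def alt_digit_sum(n):
    """Alternating sum of decimal digits, rightmost digit counted positive.
    n is divisible by 11 exactly when this sum is."""
    digits = [int(d) for d in str(n)][::-1]
    return sum(d if i % 2 == 0 else -d for i, d in enumerate(digits))

def alt_group_sum(n):
    """Alternating sum of three-digit groups taken from the right.
    Because 1001 = 7 * 11 * 13, n is divisible by 7, 11, or 13 exactly
    when this sum is."""
    s, sign, total = str(n), 1, 0
    while s:
        total += sign * int(s[-3:])
        s, sign = s[:-3], -sign
    return total

n = 8008  # 8 * 7 * 11 * 13
print(alt_digit_sum(n) % 11 == 0)  # True
print(alt_group_sum(n) % 7 == 0)   # True: the groups give 8 - 8 = 0
print(alt_group_sum(n) % 13 == 0)  # True
```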

My 2018 Mathematics A To Z: Zugzwang


My final glossary term for this year’s A To Z sequence was suggested by aajohannas, who’d also suggested “randomness” and “tiling”. I don’t know of any blogs or other projects they’re behind, but if I do hear, I’ll pass them on.

Cartoon of a thinking coati (it's a raccoon-like animal from Latin America); beside him are spelled out on Scrabble tiles, 'MATHEMATICS A TO Z', on a starry background. Various arithmetic symbols are constellations in the background.
Art by Thomas K Dye, creator of the web comics Newshounds, Something Happens, and Infinity Refugees. His current project is Projection Edge. And you can get Projection Edge six months ahead of public publication by subscribing to his Patreon. And he’s on Twitter as @Newshoundscomic.

Zugzwang.

Some areas of mathematics struggle against the question, “So what is this useful for?” As though usefulness were a particular merit — or demerit — for a field of human study. Most mathematics fields discover some use, though, even if it takes centuries. Others are born useful. Probability, for example. Statistics. Know what the fields are and you know why they’re valuable.

Game theory is another of these. The subject, as often happens, we can trace back centuries. Usually as the study of some particular game. Occasionally in the study of some political science problem. But game theory developed a particular identity in the early 20th century. Some of this from set theory experts. Some from probability experts. Some from John von Neumann, because it was the 20th century and all that. Calling it “game theory” explains why anyone might like to study it. Who doesn’t like playing games? Who, studying a game, doesn’t want to play it better?

But why it might be interesting is different from why it might be important. Think of what a game is. It is a string of choices made by one or more parties. The point of the choices is to achieve some goal. Put that way you realize: this is everything. All life is making choices, all in the pursuit of some goal, even if that goal is just “not end up any worse off”. I don’t know that the earliest researchers in game theory as a field realized what a powerful subject they had touched on. But by the 1950s they were doing serious work in strategic planning, and by 1964 were even giving us Stanley Kubrick movies.

This is taking me away from my glossary term. The field of games is enormous. If we narrow the field some we can discuss specific kinds of games. And say more involved things about these games. So first we’ll limit things by thinking only of sequential games. These are ones where there are a set number of players, and they take turns making choices. I’m not sure whether the field expects the order of play to be the same every time. My understanding is that much of the focus is on two-player games. What’s important is that at any one step there’s only one party making a choice.

The other thing narrowing the field is to think of information. There are many things that can affect the state of the game. Some of them might be obvious, like where the pieces are on the game board. Or how much money a player has. We’re used to that. But there can be hidden information. A player might conceal some game money so as to make other players underestimate her resources. Many card games have one or more cards concealed from the other players. There can be information unknown to any party. No one can make a useful prediction what the next throw of the game dice will be. Or what the next event card will be.

But there are games where there’s none of this ambiguity. These are called games with “perfect information”. In them all the players know the past moves every player has made. Or at least should know them. Players are allowed to forget what they ought to know.

There’s a separate but similar-sounding idea called “complete information”. In a game with complete information, players know everything that affects the gameplay. At least, probably, apart from what their opponents intend to do. This might sound like an impossibly high standard, at first. All games with shuffled decks of cards and with dice to roll are out. There’s no concealing or lying about the state of affairs.

Set complete-information aside; we don’t need it here. Think only of perfect-information games. What are they? Some ancient games, certainly. Tic-tac-toe, for example. Some more modern versions, like Connect Four and its variations. Some that are actually deep, like checkers and chess and go. Some that are, arguably, more puzzles than games, as in sudoku. Some that hardly seem like games, like several people agreeing how to cut a cake fairly. Some that seem like tests to prove people are fundamentally stupid, like when you auction off a dollar. (The rules are set so players can easily end up paying more than a dollar.) But that’s enough for me, at least. You can see there are games of clear, tangible interest here.

The last restriction: think only of two-player games. Or at least two parties. Any of these two-party sequential games with perfect information are a part of “combinatorial game theory”. It doesn’t usually allow for incomplete-information games. But at least the MathWorld glossary doesn’t demand they be ruled out. So I will defer to this authority. I’m not sure how the name “combinatorial” got attached to this kind of game. My guess is that it seems like you should be able to list all the possible combinations of legal moves. That number may be enormous, as chess and go players are always going on about. But you could imagine a vast book which lists every possible game. If your friend ever challenged you to a game of chess the two of you could simply agree, oh, you’ll play game number 2,038,940,949,172 and then look up to see who won. Quite the time-saver.

Most games don’t have such a book, though. Players have to act on what they understand of the current state, and what they think the other player will do. This is where we get strategies from. Not just what we plan to do, but what we imagine the other party plans to do. When working out a strategy we often expect the other party to play perfectly. That is, to make no mistakes, to not do anything that worsens their position. Or that reduces their chance of winning.

… And yes, arguably, the word “chance” doesn’t belong there. These are games where the rules are known, every past move is known, every future move is in principle computable. And if we suppose everyone is making the best possible move then we can imagine forecasting the whole future of the game. One player has a “chance” of winning in the same way Christmas day of the year 2038 has a “chance” of being on a Tuesday. That is, the probability is just an expression of our ignorance, that we don’t happen to be able to look it up.

But what choice do we have? I’ve never seen a reference that lists all the possible games of tic-tac-toe. And that’s about the simplest combinatorial-game-theory game anyone might actually play. What’s possible is to look at the current state of the game. And evaluate which player seems to be closer to her goal. And then look at all the possible moves.

There are three things a move can do. It can put the party closer to the goal. It can put the party farther from the goal. Or it can do neither. On her turn the other party might do something that moves you farther from your goal, moves you closer to your goal, or doesn’t affect your status at all. It seems like this makes strategy obvious. On every step take the available move that takes one closest to the goal. This is known as a “greedy” strategy. As the name suggests it isn’t automatically bad. If you expect the game to be a short one, greed might be the best approach. The catch is that moves that seem less good — even ones that seem to hurt you initially — might set up other, even better moves. So strategy requires some thinking beyond the current step. Properly, it requires thinking through to the end of the game. Or at least until the end of the game seems obvious.
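To make “thinking through to the end of the game” concrete, here’s a toy Python sketch of my own, nothing from any particular text: two players alternately take one or two stones from a pile, and whoever takes the last stone wins. Exhaustive lookahead, rather than any greedy rule about the next move, tells you which pile sizes are losing positions.

```python
from functools import lru_cache

# Toy perfect-information game: players alternately take 1 or 2 stones from
# a pile; whoever takes the last stone wins. "Thinking through to the end"
# is just exhaustive recursion here.

@lru_cache(maxsize=None)
def player_to_move_wins(stones):
    """True if the player about to move can force a win from this pile."""
    if stones == 0:
        return False  # no move left: the previous player took the last stone
    # A move is good if it leaves the opponent in a losing position.
    return any(not player_to_move_wins(stones - take)
               for take in (1, 2) if take <= stones)

for pile in range(1, 10):
    print(pile, "win" if player_to_move_wins(pile) else "loss")
# Piles that are multiples of 3 are losses for the player to move;
# no rule about the single next step will tell you that.
```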

We should like a strategy that leaves us no choice but to win. Next-best would be one that leaves the game undecided, since something might happen like the other player needing to catch a bus and so resigning. This is how I got my solitary win in the two months I spent in the college chess club. Worst would be the games that leave us no choice but to lose.

It can be that there are no good moves. That is, that every move available makes it a little less likely that we win. Sometimes a game offers the chance to pass, preserving the state of the game but giving the other party the turn. Then maybe the other party will do something that creates a better opportunity for us. But if we are allowed to pass, there’s a good chance the game lets the other party pass, too, and we end up in the same fix. And it may be the rules of the game don’t allow passing anyway. One must move.

The phenomenon of having to make a move when it’s impossible to make a good move has prominence in chess. I don’t have the chess knowledge to say how common the situation is. But it seems to be a situation people who study chess problems love. I suppose it appeals to a love of lost causes and the hope that you can be brilliant enough to see what everyone else has overlooked. German chess literature gave it a name 160 years ago, “zugzwang”, “compulsion to move”. Somehow I never encountered the term when I was briefly a college chess player. Perhaps because I was never in zugzwang and was just too incompetent a player to find my good moves. I first encountered the term in Michael Chabon’s The Yiddish Policeman’s Union. The protagonist picked up on the term as he investigated the murder of a chess player, and then felt himself to be in one.

Combinatorial game theorists have picked up the word, and sharpened its meaning. If I understand correctly chess players allow the term to be used for any case where a player hurts her position by moving at all. Game theorists make it more dire. This may reflect their knowledge that an optimal strategy might require taking some dismal steps along the way. The game theorist formally grants the term only to the situation where the compulsion to move changes what should be a win into a loss. This seems terrible, but then, we’ve all done this in play. We all feel terrible about it.

I’d like here to give examples. But in searching the web I can find only either courses in game theory, which are a bit too much for even me to summarize, or chess problems, which I’m not up to understanding. It seems hard to set out an example: I need to not just set out the game, but show that what had been a win is now, by any available move, turned into a loss. Chess is looser. It even allows, I discover, a double zugzwang, where both players are at a disadvantage if they have to move.

It’s a quite relatable problem. You see why game theory has this reputation as mathematics that touches all life.


And with that … I am done! All of the Fall 2018 Mathematics A To Z posts should be at this link. Next week I’ll post my big list of all the letters, though. And, as has become tradition, a post about what I learned by doing this project. And sometime before then I should have at least one more Reading the Comics post. Thanks kindly for reading and we’ll see when in 2019 I feel up to doing another of these.

Reading the Comics, April 2, 2016: Keeping Me Busy Edition


After I made a little busy work for myself posting a Reading the Comics entry the other day, Comic Strip Master Command sent a rush of mathematics themes into the comics. So it goes.

Chris Browne’s Hagar the Horrible for the 31st of March happens to be funny-because-it’s-true. It’s supposed to be transgressive to see a gambler as the best mathematician available. But quite a few of the great pioneering minds of mathematics were also gamblers looking for an edge. It may shock you to learn that mathematicians in past centuries didn’t have enough money, and would look for ways to get more. And, as ever, knowing something secret about the way cards or dice or any unpredictable event might happen gives one an edge. The question of whether a 9 or a 10 is more likely to be thrown on three dice was debated for centuries, by people as familiar to us as Galileo. And by people as familiar to mathematicians as Gerolamo Cardano.

Hagar: 'I brought a math tutor for Hamlet!' Helga: 'Great! Does he know his stuff?' Hagar: 'Are you kidding? He's the BEST card counter in the kingdom!'
It’s funny because it’s anachronistic for Blaise Pascal to be in this setting.
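Back to that dice question: the resolution is a short enumeration, which a few lines of Python can do (Galileo, of course, listed the cases by hand).

```python
from itertools import product

# Galileo's question: with three fair dice, is a total of 9 or 10 more likely?
counts = {9: 0, 10: 0}
for roll in product(range(1, 7), repeat=3):
    if sum(roll) in counts:
        counts[sum(roll)] += 1

print(counts)  # {9: 25, 10: 27}: of 216 equally likely rolls, 10 wins out slightly
```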

Gambling blends imperceptibly into everything people want to do. The question of how to fairly divide the pot of an interrupted game may seem sordid. But recast it as the problem of how to divide the assets of a partnership which had to halt — say, because one of the partners had to stop participating — and we have something that looks respectable. And gambling blends imperceptibly into security. The result of any one project may be unpredictable. The result of many similar ones, averaged together, often is quite predictable. Card games or joint-stock insurance companies; the mathematics is the same. A good card-counter might be the best mathematician available.

Tony Cochran’s Agnes for the 31st name-drops Diophantine equations. It’s in the service of a joke about a student resisting class. Diophantine equations are equations for which we only allow integer, whole-number, answers. The name refers to Diophantus of Alexandria, who lived in the third century AD. His Arithmetica describes many methods for solving equations, a prototype of the algebra we know from high school today. Generally, a Diophantine equation is a hard problem. It’s impossible, for example, to say whether an arbitrary Diophantine equation even has a solution. Finding what it might be is another bit of work. Fermat’s Last Theorem concerns a family of Diophantine equations, and it took centuries to work out that they don’t (apart from trivial cases) have answers.

Mind, we can say for specific cases whether a Diophantine equation has a solution. And those specific cases can be pretty general. If we know integers a and b that share no factor besides 1, then we can find integers x and y that make “ax + by = 1” true, for example.
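That specific case is, in effect, the extended Euclidean algorithm. Here’s a hedged sketch; the function name and the example numbers are mine.

```python
def bezout(a, b):
    """Return (g, x, y) with a*x + b*y == g, where g is the greatest common
    divisor of a and b. When a and b share no factor besides 1, g is 1 and
    (x, y) solves a*x + b*y = 1."""
    if b == 0:
        return a, 1, 0
    g, x, y = bezout(b, a % b)
    return g, y, x - (a // b) * y

g, x, y = bezout(8, 5)
print(g, x, y, 8 * x + 5 * y)  # 1 2 -3 1
```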

Graham Harrop’s Ten Cats for the 31st hurts mathematicians’ feelings on the way to trying to help a shy cat. I’m amused anyway.

And Jonathan Lemon’s Rabbits Against Magic for the 1st of April mentions Fermat’s Last Theorem. The structure of the joke is fine. If we must ask an irrelevant question of the Information Desk, mathematics has got plenty of good questions. The choice makes me suspect Lemon’s showing his age, though. The imagination-capturing power of Fermat’s Last Theorem as a great unknown has to have been diminished since the first proof was found over two decades ago. It’d be someone who grew up knowing there was this mystery about x^n plus y^n equalling z^n who’d jump to this reference.

Tom Toles’s Randolph Itch, 2 am for the 2nd of April mentions “zero-sum games”. The term comes from the mathematical theory of games. The field might sound frivolous, but that’s because you don’t know how much stuff the field considers to be “games”. Mathematicians who study them consider “games” to be sets of decisions. One or more people make choices, and gain or lose as a result of those choices. That is a pretty vague description. It covers playing solitaire and multiplayer Civilization V. It also covers career planning and imperial brinksmanship. And, for that matter, business dealings.

“Zero-sum” games refer to how we score the game’s objectives. If it’s zero-sum, then anything gained by one player must be balanced by equal losses by the other player or players. For example, in a sports league’s season standings, one team’s win must balance another team’s loss. The total number of won games, across all the teams, has to equal the total number of lost games. But a game doesn’t have to be zero-sum. It’s possible to create games in which all participants gain something, or all lose something. Or where the total gained doesn’t equal the total lost. These are, imaginatively, called non-zero-sum games. They turn up often in real-world applications. Political or military strategy often is about problems in which both parties can lose. Business opportunities are often intended to see the directly involved parties benefit. This is surely why Randolph is shown reading the business pages.
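As a small illustration with made-up numbers: rock-paper-scissors is zero-sum, since whatever one player wins the other loses, while a trade both sides benefit from is not.

```python
def is_zero_sum(game):
    """A two-player game is zero-sum when the payoffs cancel in every outcome."""
    return all(mine + yours == 0 for mine, yours in game.values())

# (my payoff, your payoff) for each pair of moves; ties and losses included.
rock_paper_scissors = {
    ("rock", "scissors"): (1, -1), ("scissors", "rock"): (-1, 1),
    ("paper", "rock"): (1, -1),    ("rock", "paper"): (-1, 1),
    ("scissors", "paper"): (1, -1), ("paper", "scissors"): (-1, 1),
    ("rock", "rock"): (0, 0), ("paper", "paper"): (0, 0),
    ("scissors", "scissors"): (0, 0),
}
# A trade both sides want: invented payoffs where everyone can come out ahead.
trade = {("deal", "deal"): (2, 3), ("deal", "pass"): (0, 0),
         ("pass", "deal"): (0, 0), ("pass", "pass"): (0, 0)}

print(is_zero_sum(rock_paper_scissors))  # True
print(is_zero_sum(trade))                # False
```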

Reblog: Parrondo’s Paradox


Ad Nihil here presents an interesting-looking game demonstrating something I hadn’t heard of before, Parrondo’s Paradox, which apparently is a phenomenon in which a combination of losing strategies becomes a winning strategy. I do want to think about this more, so I offer the link to that blog’s report so that I hopefully will go back and consider it more when I’m able.
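While I chew on it, here’s a hedged simulation of the standard coin-flip version of Parrondo’s paradox (the usual textbook setup as I remember it, not Ad Nihil’s spinning wheel), just to watch the effect appear.

```python
import random

# The standard coin-flip version of Parrondo's paradox, with the usual
# textbook parameters as I recall them. Game A is a slightly unfair coin.
# Game B is very unfair when your capital is a multiple of 3 and favorable
# otherwise.
EPS = 0.005

def average_final_capital(choose_game, steps=50_000, trials=50, seed=1):
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        capital = 0
        for _ in range(steps):
            if choose_game(rng) == "A":
                p = 0.5 - EPS          # Game A
            elif capital % 3 == 0:
                p = 0.10 - EPS         # Game B, capital a multiple of 3
            else:
                p = 0.75 - EPS         # Game B, otherwise
            capital += 1 if rng.random() < p else -1
        total += capital
    return total / trials

print("A alone:    ", average_final_capital(lambda r: "A"))
print("B alone:    ", average_final_capital(lambda r: "B"))
print("random mix: ", average_final_capital(lambda r: "A" if r.random() < 0.5 else "B"))
# A and B each drift downward on their own; randomly mixing them drifts upward.
```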

ad nihil

My inspiration with my daughter’s 8th grade probability problems continues. In a previous post I worked on a hypothetical story of monitoring all communications for security with a Bayesian analysis approach. This time when I saw the spinning wheel problems in her text book, I was yet again inspired to create a game system to demonstrate Parrondo’s Paradox.

“Parrondo’s paradox, a paradox in game theory, has been described as: A combination of losing strategies becomes a winning strategy. It is named after its creator, Juan Parrondo, who discovered the paradox in 1996.” – Wikipedia.org

Simply put, with certain (not all) combinations, you may create an overall winning strategy by playing different losing scenarios alternately in the long run. Here’s the game system I came up with (simpler than the original, I believe):

Let’s imagine a spinning wheel like below, divided into eight equal parts with 6 parts red…
